✨ Automatically map dictionaries to JSON #1599
Conversation
Would be nice to have such native support for dict!
As an improvement idea, we could also support automatic mapping to JSON for the pydantic.Json type, BaseModel, and list (list could map to JSON by default, while still letting users change the mapping to sqlalchemy.dialects.postgresql.ARRAY via the sa_type parameter; a rough sketch of that override is below).
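For illustration only (not part of this PR): the sa_type override mentioned above already exists as a Field parameter, while the JSON-by-default behaviour for list is just the proposal; the model below is made up.

```python
from typing import List, Optional

from sqlalchemy import String
from sqlalchemy.dialects.postgresql import ARRAY
from sqlmodel import Field, SQLModel


class Post(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    # Under the idea above, a bare `tags: List[str]` would map to JSON by
    # default; here the mapping is overridden to a Postgres ARRAY through
    # the existing sa_type parameter.
    tags: List[str] = Field(sa_type=ARRAY(String))
```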
One potential problem we should probably think about:
If we define our field as dict[str, int], it will not set any DB constraints, so it will be possible to have invalid data in the DB (e.g. {"param": "text"}).
Code example:
from sqlmodel import Field, Session, SQLModel, create_engine, text


class MyObject(SQLModel, table=True):
    id: int = Field(primary_key=True)
    params: dict[str, int]


engine = create_engine("sqlite:///")
SQLModel.metadata.create_all(engine)

# Imagine this row was added earlier, probably by an external tool
with Session(engine) as session:
    session.exec(
        text('INSERT INTO myobject (id, params) VALUES (1, \'{"param": "text"}\')')
    )
    session.commit()

with Session(engine) as session:
    obj = session.get(MyObject, 1)
    assert isinstance(obj.params, dict)
    assert isinstance(obj.params["param"], int), type(obj.params["param"])
    # AssertionError: <class 'str'>

When we fetch such a row, SQLModel will not complain that the type is wrong, but users may get errors in their code if they assume the fetched data has a valid type.
Such errors might be tricky to debug - static type checkers will not show any warnings.
As a solution, we could probably go the extra mile and implement a new type (using [TypeDecorator](https://docs.sqlalchemy.org/en/20/core/custom_types.html)) that will not just convert data to JSON, but also ensure the data has a valid type.
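A minimal sketch of that idea (not part of this PR; the PydanticJSON name and the use of pydantic.TypeAdapter are assumptions, just to show the shape such a validating JSON type could take):

```python
from typing import Any

from pydantic import TypeAdapter
from sqlalchemy.types import JSON, TypeDecorator


class PydanticJSON(TypeDecorator):
    """JSON column type that also validates values against a given Python type."""

    impl = JSON
    cache_ok = True

    def __init__(self, pydantic_type: Any, *args: Any, **kwargs: Any) -> None:
        super().__init__(*args, **kwargs)
        self.pydantic_type = pydantic_type
        self._adapter = TypeAdapter(pydantic_type)

    def process_bind_param(self, value: Any, dialect: Any) -> Any:
        # Validate before the value is written to the database.
        if value is not None:
            self._adapter.validate_python(value)
        return value

    def process_result_value(self, value: Any, dialect: Any) -> Any:
        # Validate (and coerce) the value read back from the database.
        if value is not None:
            value = self._adapter.validate_python(value)
        return value
```

A field could then opt in with something like params: dict[str, int] = Field(sa_type=PydanticJSON(dict[str, int])), so fetching the bad row from the example above would raise a ValidationError instead of silently returning a str.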
PS: ready to help with it
def test_type_dict_breaks() -> None:
    with pytest.raises(ValueError):

        class Hero(SQLModel, table=True):
            id: Optional[int] = Field(default=None, primary_key=True)
            tags: Dict[str, Any]
This will still fail with Pydantic V1.
I suggest we keep this test, but just mark it with needs_pydanticv2.
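For context, a minimal sketch of what that marking looks like mechanically; the real needs_pydanticv2 marker lives in the test suite's conftest.py, and the skip condition below is only an approximation of it:

```python
import pydantic
import pytest

# Approximation of the conftest marker: skip the test unless Pydantic v2 is installed.
needs_pydanticv2 = pytest.mark.skipif(
    not pydantic.VERSION.startswith("2."), reason="requires Pydantic v2"
)


@needs_pydanticv2
def test_something_that_needs_pydantic_v2() -> None:
    ...
```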
> This fixes the original error:
> ValueError: <class 'app.models.NeonMetadata'> has no matching SQLAlchemy type
I think we don't need this part here
from .conftest import needs_pydanticv2


def test_type_mappings(clear_sqlmodel):
As I understand it, the idea of this test module is to test all type mappings in one place.
Then we should probably add tests for (a rough sketch for a couple of these follows the list):
- Enum
- ipaddress.IPv4Address
- ipaddress.IPv4Network
- ipaddress.IPv6Address
- ipaddress.IPv6Network
- Path
- EmailStr
- timedelta
- Decimal
- uuid.UUID
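As a sketch of the shape such additions could take (it mirrors the test above; the field types and model names are made up, and the assertions would need to match what SQLModel actually emits):

```python
import enum
from decimal import Decimal
from typing import Optional

from sqlmodel import Field, SQLModel, create_engine


class Color(str, enum.Enum):
    red = "red"
    blue = "blue"


def test_type_enum_and_decimal(clear_sqlmodel):
    class Item(SQLModel, table=True):
        id: Optional[int] = Field(default=None, primary_key=True)
        color: Color
        price: Decimal

    # Creating the tables should not raise
    # "... has no matching SQLAlchemy type" for these annotations.
    engine = create_engine("sqlite://")
    SQLModel.metadata.create_all(engine)
```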
@YuriiMotov happy for you to add support for
    
@YuriiMotov also it doesn't seem like SQLModel does validation for any incoming data? It would be a different behaviour from the rest?
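That point can be checked directly: table models skip validation when instantiated, so a wrong type passes through silently. A small demonstration (the model is made up):

```python
from typing import Optional

from sqlmodel import Field, SQLModel


class Movie(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    year: int


# No validation error is raised for table models, even with a wrong type.
movie = Movie(year="not a number")
print(repr(movie.year))  # 'not a number'
```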
    
          
It's true, but..
    
As an idea: we can add such validation for
For example:

class MyModel(SQLModel, table=True):
    f1: dict[str, int]  # DB data will be validated
    f2: dict[str, int] = Field(sa_type=JSON)  # DB data will not be validated
    
          
I've had good luck using Pydantic's
    