pydantic-ai

Master Pydantic: Data Validation for Python

Build robust, self-validating Python data models with Pydantic. From BaseModel fundamentals to production API integration.

4 modules · 16 lessons · free to read

What you'll learn

  • Build data models using Pydantic BaseModel with type annotations and automatic validation
  • Create custom validators to enforce complex business rules and data constraints
  • Configure models for different use cases using model_config and validators
  • Serialize and deserialize data in multiple formats (JSON, Python, custom)
  • Integrate Pydantic with FastAPI and databases for production-ready applications

01 Pydantic Foundations

Master the fundamentals of Pydantic BaseModel, automatic type validation, and field configuration. Build your first self-validating data models.

1. Your First BaseModel

A BaseModel is a class that automatically validates data when you create an instance of it. Every field is defined with a Python type annotation, and Pydantic enforces that type at instantiation time.

```python
from pydantic import BaseModel

class User(BaseModel):
    name: str
    email: str
    age: int

user = User(name="Alice", email="alice@example.com", age=30)
print(user.name)  # Alice
print(user.age)   # 30
```

When you pass invalid data — a string for an integer field, for example — Pydantic raises a ValidationError instead of silently accepting bad data. This is the core insight: validation happens automatically, so you never need to write isinstance() checks.

```python
from pydantic import ValidationError

try:
    bad_user = User(name="Bob", email="bob@example.com", age="not a number")
except ValidationError as e:
    print(e.errors())  # List of validation failures
```

Constraints

  • Use type annotations to define the fields.
  • Do not use any custom validators — let Pydantic handle type enforcement automatically.
  • The function must return a User instance, not a dictionary.
Practice Lesson 1

2. Types and Automatic Validation

Pydantic's type annotations are not just hints — they are enforcement rules. When you define a field with type str, int, or bool, Pydantic checks that the value matches the type and rejects it if it doesn't.

```python
from pydantic import BaseModel

class Account(BaseModel):
    username: str
    age: int
    is_active: bool

account = Account(username="alice", age=25, is_active=True)
```

Pydantic goes beyond simple type checking. It coerces values intelligently: a string "42" becomes the integer 42, and "true" becomes the boolean True. However, you can set strict=True on a field to reject coercion and require exact types. Types like list, dict, and datetime objects are all validated automatically based on their shape and structure.
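Both behaviors — lax coercion and strict rejection — can be sketched side by side (the Reading model and its field names are made up for illustration):

```python
from pydantic import BaseModel, Field, ValidationError

class Reading(BaseModel):
    loose: int                       # lax mode: the string "42" is coerced to 42
    exact: int = Field(strict=True)  # strict: only a real int is accepted

r = Reading(loose="42", exact=7)     # coercion succeeds for the lax field

try:
    Reading(loose=1, exact="7")      # strict field rejects the string
except ValidationError:
    print("strict field rejected the string")
```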

Constraints

  • Use basic type annotations: str, int, bool.
  • The function must accept three arguments and return a Person instance.
  • Pydantic will automatically coerce compatible types (e.g., string "25" → integer 25).
Practice Lesson 2

3. Required Fields vs Defaults

By default, every field in a Pydantic model is required — you must provide a value when instantiating the model. To make a field optional, use Optional[type] or provide a default value using Field() or assignment syntax.

```python
from typing import Optional
from pydantic import BaseModel

class Config(BaseModel):
    database_url: str
    max_connections: int = 10
    debug_mode: Optional[bool] = None

config = Config(database_url="postgres://localhost")
print(config.max_connections)  # 10
print(config.debug_mode)       # None
```

You can also use Field() to set a default and add metadata like a description or constraints. Field() gives you fine-grained control: set the default, add a validation constraint, provide a description for API docs, and control how the field is serialized. Optional fields with None as the default are a common pattern for nullable data.
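A minimal sketch of a Field() default combined with constraints and a description (Server, host, and port are illustrative names):

```python
from pydantic import BaseModel, Field

class Server(BaseModel):
    host: str  # required: no default provided
    port: int = Field(default=8080, ge=1, le=65535, description="TCP port to bind")

s = Server(host="localhost")       # port falls back to the default 8080
custom = Server(host="db", port=5432)
```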

Constraints

  • database_url must be a required string field with no default.
  • timeout_seconds and retry_count must have default values of 30 and 3 respectively.
  • The function should only accept database_url as a parameter.
Practice Lesson 3

4. Field Metadata and Validation Messages

The Field() function lets you attach metadata to each field: a description, constraints (min/max), aliases for input, and serialization rules. This metadata is used by Pydantic's JSON schema generation and makes API documentation automatic.

```python
from pydantic import BaseModel, Field

class Product(BaseModel):
    name: str = Field(..., description="Product name")
    price: float = Field(gt=0, description="Price in USD (must be positive)")
    quantity: int = Field(ge=0, description="Stock quantity (non-negative)")

product = Product(name="Laptop", price=999.99, quantity=10)
```

When validation fails, Pydantic includes a descriptive error message that mentions the constraint. You can also customize error messages by writing custom validators or by raising structured errors (covered in the next module). Field constraints like gt (greater than), ge (greater than or equal), le (less than or equal), and max_length are enforced automatically. Aliases let you accept different field names in input JSON while mapping them to your Python names internally.
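Aliases are mentioned above but not shown in the lesson code; a small sketch (Item and the camelCase itemName key are hypothetical names):

```python
from pydantic import BaseModel, Field

class Item(BaseModel):
    # input JSON uses camelCase; Python code uses snake_case
    item_name: str = Field(alias="itemName", max_length=50)

item = Item(**{"itemName": "Widget"})  # populated via the alias
```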

Constraints

  • Use Field() to add constraints: price must be > 0, stock must be >= 0.
  • Add descriptions to each field for API documentation.
  • The function must return a Product instance with valid data.
Practice Lesson 4

02 Custom Validators and Error Handling

Write custom validators to enforce complex business rules, handle cross-field dependencies, and return meaningful error messages to API users.

1. Writing Field Validators

A field validator is a function decorated with @field_validator that runs when the field's value is validated during model creation (and on later assignment if validate_assignment is enabled). It receives the field value and the validation info context, and can either return the value (optionally modified) or raise a validation error.

```python
from pydantic import BaseModel, field_validator

class Account(BaseModel):
    username: str
    password: str

    @field_validator('password')
    @classmethod
    def validate_password(cls, v):
        if len(v) < 8:
            raise ValueError('Password must be at least 8 characters')
        return v

account = Account(username='alice', password='securepassword')
```

Field validators run in the order they are defined and have access to the validation info context (which includes the raw input data and the validation mode). You can apply the same validator to multiple fields by passing several field names to the decorator, or use mode='before' to validate the raw input before type coercion. Returning a modified value lets you normalize input — for example, converting a URL to lowercase or stripping whitespace.
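The multi-field pattern, combined with normalization, can be sketched like this (Profile and its fields are illustrative):

```python
from pydantic import BaseModel, field_validator

class Profile(BaseModel):
    first_name: str
    last_name: str

    # one validator applied to two fields: normalize first, then check
    @field_validator('first_name', 'last_name')
    @classmethod
    def strip_and_require(cls, v: str) -> str:
        v = v.strip()
        if not v:
            raise ValueError('must not be blank')
        return v

p = Profile(first_name='  Ada ', last_name='Lovelace')
```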

Constraints

  • Use @field_validator decorator for the password field
  • Raise ValueError with a descriptive message if password is too short
  • Return the password value if validation passes
Practice Lesson 1

2. Model-Level Validation

A model validator is a function decorated with @model_validator that runs after all fields are assigned. Depending on the mode, it receives either the raw input dictionary (mode='before') or the constructed model instance (mode='after'), and can enforce rules that span multiple fields.

```python
from pydantic import BaseModel, model_validator

class Address(BaseModel):
    street: str
    city: str
    zip_code: str
    country: str

    @model_validator(mode='after')
    def validate_address(self):
        if self.country == 'US' and len(self.zip_code) != 5:
            raise ValueError('US zip codes must be 5 digits')
        return self

addr = Address(street='123 Main', city='NYC', zip_code='10001', country='US')
```

Model validators are essential for validating cross-field dependencies. Use mode='after' to validate the fully typed model, or mode='before' to validate raw input before type coercion. Unlike field validators, model validators see all fields at once and can make decisions based on combinations of values. This pattern is ideal for rules like "if country is US, zip code must follow US format."

Constraints

  • Use @model_validator with mode='after'
  • Check the country and zip_code together
  • Raise ValueError if the US zip code is not exactly 5 digits
  • Return self after validation passes
Practice Lesson 2

3. Custom Validation Errors

Instead of raising a generic ValueError, you can raise a PydanticCustomError to provide structured error information that Pydantic includes in the ValidationError. This gives you control over error codes, context, and the exact message users see.

```python
from pydantic import BaseModel, field_validator, PydanticCustomError

class SignUp(BaseModel):
    email: str
    age: int

    @field_validator('age')
    @classmethod
    def validate_age(cls, v):
        if v < 13:
            raise PydanticCustomError(
                'age_too_young',
                'Users must be at least 13 years old',
            )
        return v

signup = SignUp(email='user@example.com', age=10)  # raises ValidationError: age is under 13
```

PydanticCustomError takes an error code (a slug like 'age_too_young'), a message template, and optional context dict. The error code is useful for i18n (internationalization) and for the client to show localized messages. API responses can include the error code and context so frontend applications can handle errors programmatically instead of relying only on strings.

Constraints

  • Use PydanticCustomError with an error code and message
  • The error code must be 'age_too_young'
  • The message must be 'Users must be at least 13 years old'
  • Return age if validation passes
Practice Lesson 3

4. Root Validators and Transformations

Field validators with mode='before' and model validators with mode='before' let you transform data before type coercion happens. This is useful for normalizing input — for example, converting a date string in multiple formats to a standard format, lowercasing email addresses, or stripping whitespace.

```python
from datetime import date
from pydantic import BaseModel, field_validator

class Event(BaseModel):
    name: str
    date_input: date

    @field_validator('date_input', mode='before')
    @classmethod
    def parse_date(cls, v):
        if isinstance(v, str):
            # Try parsing ISO format; fall back to a default date
            try:
                return date.fromisoformat(v)
            except ValueError:
                return date(2025, 1, 1)
        return v

event = Event(name='Launch', date_input='2025-03-04')
```

Transformations during validation are more efficient than post-instantiation cleanup and keep your model in a consistent state from the moment it exists. Use mode='before' to normalize raw input before Pydantic's type system touches it, and mode='after' to refine already-typed fields. The transformation pattern is essential for handling user input from APIs, forms, or files where formats vary.
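An after-mode counterpart for refining an already-typed field might look like this (Contact is an illustrative model):

```python
from pydantic import BaseModel, field_validator

class Contact(BaseModel):
    email: str

    # mode='after' runs once type validation succeeds,
    # so v is guaranteed to be a str here
    @field_validator('email', mode='after')
    @classmethod
    def normalize(cls, v: str) -> str:
        return v.strip().lower()

c = Contact(email='  Alice@Example.COM ')
```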

Constraints

  • Use @field_validator with mode='before' to parse the input before type coercion
  • Accept date_input as a string in ISO format (YYYY-MM-DD)
  • Use date.fromisoformat() to parse the string
  • Return a date object
Practice Lesson 4

03 Advanced Models and Serialization

Build complex data structures with nested models, control validation and serialization behavior with configuration, and generate JSON schemas for API documentation.

1. Nested Models and Relationships

A Pydantic model can include other Pydantic models as fields, creating a hierarchy. When you nest models, Pydantic validates the nested data according to the inner model's rules before constructing the outer model.

```python
from pydantic import BaseModel

class Address(BaseModel):
    street: str
    city: str
    zip_code: str

class Customer(BaseModel):
    name: str
    email: str
    address: Address

customer = Customer(
    name="Alice",
    email="alice@example.com",
    address={"street": "123 Main", "city": "NYC", "zip_code": "10001"}
)
```

Pydantic automatically validates the nested data: if the address data is missing a field or has an invalid type, a ValidationError is raised before the Customer model is created. You can nest as deeply as you need, and even create circular relationships using forward references. Lists of models are also supported: items: list[Item] means Pydantic validates each Item in the list.
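The list-of-models case can be sketched as (Item and Order are illustrative names):

```python
from pydantic import BaseModel

class Item(BaseModel):
    sku: str
    price: float

class Order(BaseModel):
    customer: str
    items: list[Item]  # each dict in the list is validated as an Item

order = Order(
    customer="Alice",
    items=[{"sku": "A1", "price": 9.99}, {"sku": "B2", "price": 4.50}],
)
```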

Constraints

  • Address and Customer are separate BaseModel classes
  • Customer.address is an Address model, not a dict or string
  • Pydantic validates the nested Address data automatically
Practice Lesson 1

2. Model Configuration for Different Contexts

ConfigDict lets you control Pydantic's behavior for a model: whether to validate on assignment, whether to freeze the model (make it immutable), JSON schema generation settings, and more.

```python
from pydantic import BaseModel, ConfigDict

class User(BaseModel):
    model_config = ConfigDict(
        frozen=True,
        validate_assignment=True,
        json_schema_extra={"examples": [{"name": "Alice", "age": 30}]}
    )
    name: str
    age: int

user = User(name="Alice", age=30)
user.age = 31  # Raises ValidationError because frozen=True
```

Common ConfigDict settings: frozen=True makes the model immutable; validate_assignment=True validates changes to fields after instantiation; str_strip_whitespace=True removes leading/trailing spaces from strings; json_schema_extra adds custom metadata to the generated JSON schema. Use different models with different configs for different use cases — a database model might allow assignment, while an API response model might be frozen.
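A small sketch of the whitespace and assignment settings mentioned above (Form is an illustrative model):

```python
from pydantic import BaseModel, ConfigDict

class Form(BaseModel):
    model_config = ConfigDict(
        str_strip_whitespace=True,  # strip leading/trailing spaces on all str fields
        validate_assignment=True,   # re-run validation when a field is reassigned
    )
    name: str

f = Form(name='  Alice  ')  # whitespace stripped at creation
f.name = '  Bob '           # re-validated (and stripped) on assignment
```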

Constraints

  • Use ConfigDict with frozen=True
  • The function should return a User instance
  • Do not attempt to modify the user after creation
Practice Lesson 2

3. JSON Schema and Documentation

Every Pydantic model can export itself as a JSON schema, a standardized format that describes the structure of the model. Use model_json_schema() to generate the schema, which includes field types, descriptions, constraints, and examples.

```python
import json
from pydantic import BaseModel, Field

class Product(BaseModel):
    name: str = Field(..., description="Product name")
    price: float = Field(gt=0, description="Price in USD")
    in_stock: bool = Field(default=True)

schema = Product.model_json_schema()
print(json.dumps(schema, indent=2))
```

JSON schema is the foundation for API documentation tools like Swagger/OpenAPI. FastAPI automatically generates interactive API docs from your Pydantic models using JSON schema. The schema includes all your Field descriptions, constraints, default values, and examples — everything needed for API users to understand the data contract. You can customize the schema using ConfigDict's json_schema_extra setting.

Constraints

  • Use Field() with descriptions for both fields
  • Use model_json_schema() to generate and return the schema
  • The return value should be a dictionary (the JSON schema object)
Practice Lesson 3

4. Serialization Modes and Custom Output

Serialization is converting a model instance to JSON or Python dict. Use model_dump() and model_dump_json() to serialize. You can include/exclude fields, convert enum values, and apply custom serializers.

```python
from pydantic import BaseModel, field_serializer

class User(BaseModel):
    name: str
    email: str
    password: str

    @field_serializer('password')
    def serialize_password(self, value):
        return '***'

user = User(name="Alice", email="alice@example.com", password="secret")
print(user.model_dump())
# {'name': 'Alice', 'email': 'alice@example.com', 'password': '***'}
```

Use model_dump(include={...}, exclude={...}) to control which fields appear in the output. The exclude_none=True flag removes None values. by_alias=True uses field aliases in the output. Custom field serializers with @field_serializer let you transform values during serialization — mask sensitive data, format dates, or compute derived fields. Different API responses often need different serializations of the same model.
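The dump options above can be sketched as (User and internal_note are illustrative names):

```python
from typing import Optional
from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str
    internal_note: Optional[str] = None

u = User(id=1, name="Alice")
public = u.model_dump(exclude={"internal_note"})  # drop a specific field
compact = u.model_dump(exclude_none=True)         # drop all None-valued fields
```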

Constraints

  • Use @field_serializer('password') to customize password serialization
  • The serializer should return '***' for the password
  • Return the result of model_dump()
Practice Lesson 4

04 Real-World Integration

Integrate Pydantic with FastAPI, databases, and external APIs. Build production validation pipelines that handle real data from diverse sources.

1. Pydantic in FastAPI Routes

FastAPI is built on Pydantic. When you define a route with a Pydantic model as a parameter, FastAPI automatically validates the incoming JSON and deserializes it into the model. If validation fails, FastAPI returns a 422 Unprocessable Entity response with detailed error information.

```python
from fastapi import FastAPI
from pydantic import BaseModel, Field

app = FastAPI()

class User(BaseModel):
    name: str
    email: str = Field(..., pattern=r'.*@.*')
    age: int = Field(ge=18)

@app.post('/users')
def create_user(user: User):
    return {"message": f"User {user.name} created", "user": user}
```

FastAPI uses Pydantic models for both request bodies and response bodies. When you specify a model as a route parameter, FastAPI automatically generates OpenAPI/Swagger documentation from the model's JSON schema. Validation errors are caught before your route handler runs, keeping your code clean. You can also use Pydantic models for query parameters, path parameters, and headers.

Constraints

  • Use Field(ge=18) for age constraint
  • The function accepts a User model instance
  • Return a dict or the User instance
Practice Lesson 1

2. Database Models and Queries

Pydantic models can represent database rows. You often create separate models for database operations (with database IDs) and API responses (without internal fields). This separation keeps your API contract decoupled from your database schema.

```python
from datetime import datetime
from pydantic import BaseModel

class UserDB(BaseModel):
    id: int
    name: str
    email: str
    created_at: datetime

class UserCreate(BaseModel):
    name: str
    email: str

class UserResponse(BaseModel):
    id: int
    name: str
    email: str
```

Create three models: UserCreate (accepts user input without IDs), UserDB (represents database rows with IDs and timestamps), and UserResponse (what the API returns). Validate user input against UserCreate before inserting into the database, then convert the database row to UserResponse for the API response. This pattern keeps validation logic separate from data access.
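One possible sketch of the conversion flow, with a stand-in for the database insert (the register function and the hard-coded id=1 are hypothetical):

```python
from datetime import datetime
from pydantic import BaseModel

class UserCreate(BaseModel):
    name: str
    email: str

class UserDB(BaseModel):
    id: int
    name: str
    email: str
    created_at: datetime

class UserResponse(BaseModel):
    id: int
    name: str
    email: str

def register(payload: UserCreate) -> UserResponse:
    # stand-in for a real insert: pretend the database assigned id=1
    row = UserDB(id=1, created_at=datetime.now(), **payload.model_dump())
    # strip internal fields before returning the API shape
    return UserResponse(**row.model_dump(exclude={"created_at"}))

resp = register(UserCreate(name="Alice", email="alice@example.com"))
```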

Constraints

  • UserCreate has no id or created_at
  • UserDB represents a database row with id and created_at
  • UserResponse omits created_at for API output
  • The function converts UserCreate to UserResponse
Practice Lesson 2

3. Handling API Responses from External Services

Third-party APIs don't always return exactly what you expect. Fields may be missing, have unexpected types, or contain extra data. Use defaults or Optional fields to tolerate missing data, and ConfigDict(extra='ignore') to silently drop unexpected fields.

```python
import requests
from pydantic import BaseModel, ConfigDict, field_validator

class WeatherAPI(BaseModel):
    model_config = ConfigDict(extra='ignore')
    temp: float
    humidity: int
    location: str

    @field_validator('humidity', mode='before')
    @classmethod
    def coerce_humidity(cls, v):
        # API might return humidity as a string like "85%"
        if isinstance(v, str):
            return int(v.rstrip('%'))
        return v

response = requests.get('https://api.weather.example.com/current')
weather = WeatherAPI(**response.json())
```

Use ConfigDict(extra='allow') to accept extra fields, extra='forbid' to reject them, or extra='ignore' (default) to silently drop them. Combine with field validators to normalize API responses: parse date strings, extract nested values, and handle alternative field names using aliases.
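The extra modes can be contrasted directly (Strict, Lenient, and the surprise field are illustrative names):

```python
from pydantic import BaseModel, ConfigDict, ValidationError

class Strict(BaseModel):
    model_config = ConfigDict(extra='forbid')  # unknown fields are an error
    name: str

class Lenient(BaseModel):
    model_config = ConfigDict(extra='ignore')  # unknown fields are dropped
    name: str

ok = Lenient(name='x', surprise=1)  # 'surprise' is silently discarded
try:
    Strict(name='x', surprise=1)    # 'surprise' is rejected
except ValidationError:
    print("extra field rejected")
```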

Constraints

  • Use ConfigDict(extra='ignore') to ignore unexpected fields
  • The function accepts a dict and returns a WeatherAPI instance
  • All three fields must be present for successful parsing
Practice Lesson 3

4. Building a Production Validation Pipeline

A production validation pipeline combines all Pydantic techniques: parsing raw input, validating with models, logging failures, retrying on error, and storing cleaned data. Structure your pipeline to be resilient and observable.

```python
from pydantic import BaseModel, ValidationError

class DataRecord(BaseModel):
    id: str
    value: int
    timestamp: str

def ingest_data(raw_records: list[dict]) -> dict:
    valid = []
    invalid = []
    for record in raw_records:
        try:
            validated = DataRecord(**record)
            valid.append(validated)
        except ValidationError as e:
            # Log the error for monitoring
            invalid.append({"record": record, "errors": e.errors()})
    return {
        "valid": valid,
        "invalid": invalid,
        "success_rate": len(valid) / len(raw_records) if raw_records else 0.0,
    }
```

Production pipelines catch ValidationError, log the offending data with detailed error info, and track metrics like success rate. Store both valid records and failures separately for analysis. Use Pydantic's batch validation capabilities, and validate both upstream (at API boundaries) and downstream (before database insert) to catch errors early.
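One way to validate a whole batch against a list type in a single call is Pydantic's TypeAdapter; a minimal sketch:

```python
from pydantic import BaseModel, TypeAdapter

class DataRecord(BaseModel):
    id: str
    value: int
    timestamp: str

# TypeAdapter validates arbitrary types, including list[DataRecord],
# without wrapping the list in its own BaseModel
batch = TypeAdapter(list[DataRecord])
records = batch.validate_python([
    {"id": "a", "value": 1, "timestamp": "2025-01-01"},
    {"id": "b", "value": "2", "timestamp": "2025-01-02"},  # lax coercion applies
])
```

Note that validate_python raises on the first invalid record; for per-record error collection, the try/except loop shown above is the pattern to use.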

Constraints

  • For each record, attempt validation using DataRecord
  • Catch ValidationError and collect invalid records
  • Return a dict with 'valid' and 'invalid' lists
Practice Lesson 4

Frequently Asked Questions

What is the primary purpose of Pydantic BaseModel?
To automatically validate data types when creating instances. BaseModel validates data automatically against type annotations, catching errors before they propagate into your business logic.
How do you make a field optional in Pydantic?
Use Optional[type] or provide a default value. Pydantic follows Python's typing conventions: Optional[type] (equivalent to Union[type, None]) or providing a default value makes a field optional.
What does the Field() function allow you to do?
Add metadata like descriptions, constraints (gt, ge, lt, le), and aliases. Field() provides a way to add metadata to fields, including constraints (gt=0 for 'greater than'), descriptions for API documentation, and serialization aliases.
What is the constraint `gt=0` in Field()?
Greater than 0 (does not include 0). `gt` stands for 'greater than' and enforces that the value is strictly greater than the specified number. Use `ge` for 'greater than or equal to'.
When you instantiate a Pydantic model with an invalid type, what happens?
A ValidationError is raised (unless coercion is possible). Pydantic raises a ValidationError when the data does not match the type annotation. However, it attempts to coerce compatible types (e.g., string '42' to integer 42) before raising an error.
How do I create a User BaseModel class with fields name (str), email (str), and age (int). Write a function `create_user()` that returns a User instance?
A BaseModel is a class that automatically validates data when you create an instance from it. Every field is defined with a Python type annotation, and Pydantic enforces that type at instantiation time.
How do I create a Person BaseModel with fields name (str), age (int), and is_admin (bool). Write a function `create_person(name, age, is_admin)` that returns a Person instance with the given values?
Pydantic's type annotations are not just hints — they are enforcement rules. When you define a field with type `str`, `int`, or `bool`, Pydantic checks that the value matches the type and rejects it if it doesn't.
How do I create a Config BaseModel where `database_url` is required (str), `timeout_seconds` has a default of 30 (int), and `retry_count` has a default of 3 (int). Write a function `create_config(database_url)` that returns a Config instance?
By default, every field in a Pydantic model is required — you must provide a value when instantiating the model. To make a field optional, use `Optional[type]` or provide a default value using `Field()` or assignment syntax.
How do I create a Product BaseModel using the `Field()` function with name (str, required), price (float, must be positive using `gt=0`), and stock (int, non-negative using `ge=0`). Write a function `create_product(name, price, stock)` that returns a Product instance?
The `Field()` function lets you attach metadata to each field: a description, constraints (min/max), aliases for input, and serialization rules. This metadata is used by Pydantic's JSON schema generation and makes API documentation automatic.
What decorator is used to validate individual fields in Pydantic?
@field_validator. @field_validator is the decorator used to write custom validation logic for individual fields.
When should you use @model_validator instead of @field_validator?
When you need to validate multiple fields together or enforce cross-field constraints. @model_validator runs after all fields are assigned and is ideal for validating relationships between fields.
What does mode='before' do in a field validator?
Validates the raw input before Pydantic's type coercion. mode='before' lets you validate and transform raw input before type coercion happens. Useful for parsing dates or normalizing strings.
What is the advantage of raising PydanticCustomError instead of ValueError?
PydanticCustomError provides structured error codes and context for API clients. PydanticCustomError lets you specify an error code and context, enabling clients to handle errors programmatically and show localized messages.
How do validators enforce business rules in Pydantic?
By raising an exception during instantiation if validation fails. Validators raise an exception if validation fails, preventing invalid instances from being created. This ensures your model is always in a valid state.
How do I create an Account BaseModel with username (str) and password (str) fields. Write a `@field_validator` for password that ensures it is at least 8 characters long?
A field validator is a function decorated with `@field_validator` that runs whenever a field is assigned. It receives the field value and the validation info context, and can either return the value (optionally modified) or raise a validation error.
How do I create an Address BaseModel with street, city, zip_code, and country fields. Write a `@model_validator` that enforces: if country is 'US', the zip_code must be exactly 5 digits?
A model validator is a function decorated with `@model_validator` that runs after all fields are assigned. It receives the entire model as a dictionary or instance and can enforce rules that span multiple fields.
How do I create a SignUp BaseModel with email (str) and age (int) fields. Write a `@field_validator` for age that raises `PydanticCustomError` if age is less than 13, with error code 'age_too_young' and message 'Users must be at least 13 years old'?
Instead of raising a generic ValueError, you can raise a `PydanticCustomError` to provide structured error information that Pydantic includes in the ValidationError. This gives you control over error codes, context, and the exact message users see.
How do I create an Event BaseModel with name (str) and date_input (str). Write a `@field_validator` with `mode='before'` that accepts date_input as a string, parses it into a date object (using a simple format or returning a default date), and returns the date?
Field validators with `mode='before'` and model validators with `mode='before'` let you transform data before type coercion happens. This is useful for normalizing input — for example, converting a date string in multiple formats to a standard format, lowercasing email addresses, or stripping whitespace.
What is a nested model in Pydantic?
A model that contains other models as fields. A nested model is one that has fields defined as other Pydantic models. Pydantic validates each nested structure according to its own rules.
What does ConfigDict control in a Pydantic model?
The model's validation behavior, immutability, and JSON schema generation. ConfigDict is a configuration class that controls model behavior: frozen=True, validate_assignment=True, json_schema_extra, and other settings.
What is the purpose of model_json_schema()?
To generate a JSON Schema from the model definition for documentation and validation. model_json_schema() generates a JSON Schema that describes the model's structure. This is used for API documentation, client-side validation, and code generation.
What does @field_serializer do?
Transforms a field's value during serialization (when converting to JSON/dict). @field_serializer lets you customize how a field is output during serialization, useful for masking sensitive data or formatting values.
When should you use frozen=True in ConfigDict?
To make a model immutable after instantiation. frozen=True makes a model immutable. Attempting to modify fields after creation raises a ValidationError, useful for API response models that shouldn't change.
How do I create an Address BaseModel with street, city, and zip_code fields. Create a Customer BaseModel with name, email, and address (Address model) fields. Write a function that returns a Customer instance?
A Pydantic model can include other Pydantic models as fields, creating a hierarchy. When you nest models, Pydantic validates the nested data according to the inner model's rules before constructing the outer model.
How do I create a User BaseModel with model_config setting frozen=True. Add name (str) and age (int) fields. Write a function that returns a User instance?
ConfigDict lets you control Pydantic's behavior for a model: whether to validate on assignment, whether to freeze the model (make it immutable), JSON schema generation settings, and more.
How do I create a Product BaseModel with name (str) and price (float, must be > 0) fields. Write a function that returns the JSON schema of the Product model as a dict?
Every Pydantic model can export itself as a JSON schema, a standardized format that describes the structure of the model. Use `model_json_schema()` to generate the schema, which includes field types, descriptions, constraints, and examples.
How do I create a User BaseModel with name, email, and password fields. Use @field_serializer to hide the password during serialization (return '***'). Write a function that returns user.model_dump()?
Serialization is converting a model instance to JSON or Python dict. Use `model_dump()` and `model_dump_json()` to serialize. You can include/exclude fields, convert enum values, and apply custom serializers.
How does FastAPI use Pydantic models?
To automatically validate request bodies and generate OpenAPI documentation. FastAPI integrates tightly with Pydantic: models validate request/response data and automatically generate interactive API docs.
Why create separate Pydantic models for Create, DB, and Response?
To keep API contracts decoupled from database schemas and prevent exposing sensitive fields. Separate models (UserCreate, UserDB, UserResponse) provide flexibility: accept input without IDs, store with timestamps, return only public fields.
What does ConfigDict(extra='ignore') do when parsing third-party API data?
Silently drops unexpected fields and only validates expected ones. extra='ignore' makes your code resilient to API changes by accepting and discarding unexpected fields instead of failing.
How should a production validation pipeline handle invalid records?
Catch ValidationError, log failures with error details, and continue processing. Production pipelines must be resilient: catch errors, log them for monitoring and debugging, separate valid/invalid results, and continue processing.
What is the main advantage of integrating Pydantic with FastAPI and databases?
Validation happens once at the API boundary, and the same model definition drives docs and data contracts. With Pydantic + FastAPI, models are the single source of truth: they validate input, generate docs, and enforce contracts across your system.
How do I create a User Pydantic model with name (str), email (str), and age (int, >= 18) fields. Write a function that accepts a User instance and returns a dict with the user's data?
FastAPI is built on Pydantic. When you define a route with a Pydantic model as a parameter, FastAPI automatically validates the incoming JSON and deserializes it into the model. If validation fails, FastAPI returns a 422 Unprocessable Entity response with detailed error information.
How do I create three models: UserCreate (name, email), UserDB (id, name, email, created_at as datetime), and UserResponse (id, name, email). Write a function that accepts a UserCreate and returns a UserResponse?
Pydantic models can represent database rows. You often create separate models for database operations (with database IDs) and API responses (without internal fields). This separation keeps your API contract decoupled from your database schema.
How do I create a WeatherAPI model that accepts temp (float), humidity (int), and location (str). Use ConfigDict(extra='ignore') to handle extra API fields. Write a function that accepts a dict (simulating an API response) and returns a WeatherAPI instance?
Third-party APIs don't always return exactly what you expect. Fields may be missing, have unexpected types, or contain extra data. Use Pydantic models with `ConfigDict(extra='ignore')` to handle missing fields gracefully and ignore unexpected ones.
How do I create a DataRecord model with id (str), value (int), and timestamp (str) fields. Write a function that accepts a list of dicts (raw records), validates each one, and returns a dict with valid records and invalid records?
A production validation pipeline combines all Pydantic techniques: parsing raw input, validating with models, logging failures, retrying on error, and storing cleaned data. Structure your pipeline to be resilient and observable.

Ready to write code?

Theory is just the start. Write real code, run tests, build the habit.

Open the playground →