As explained in Chapter 3, “Starting Out,” the data model of a Web application is usually its foundation, and at any rate is an excellent place to begin exploring the details of Django development. Although this chapter has two main sections—defining models, and then using them—the two halves are more intertwined than separate. We need to consider how we plan to use our models, while we’re defining them, to generate the most effective arrangement of classes and relationships. And, of course, you can’t make the best use of a model without understanding the how and the why of its definition.
Django’s database model layer makes heavy use of an ORM (Object-Relational Mapper), and it’s a good idea to understand the reasoning behind this design decision as well as the pluses and minuses of the approach. Therefore, we start out this section with an explanation of Django’s ORM, after which we get into the details of model fields, the possible relationships between model classes, and the use of model class metadata to define specific behaviors of the model or enable and customize the Django admin application.
Django, along with most other modern Web frameworks (as well as many other application development tools), relies on a rich data access layer that attempts to bridge an underlying relational database with Python’s object-oriented nature. These ORMs are still a subject of much debate in the development community with various arguments for and against their use. As Django was designed with the use of an ORM in mind, we present to you four arguments in favor of using them, specifically Django’s own implementation.
Django model objects, as we cover later in this chapter, are first and foremost a way of defining a collection of fields, which generally map to database columns. This provides the first and primary step in tying the relational database to object-oriented concepts. Instead of a SQL query like SELECT name FROM authors WHERE id=5, one can request the Author object whose id is 5 and examine author.name, which is a much more Pythonic interface to the data.
However, model objects can add a lot of extra value to that humble beginning. Django’s ORM, like many others, enables you to define arbitrary instance methods, leading to any number of useful things. For example:
You can define read-only combinations of fields or attributes, sometimes known as data aggregation or calculated attributes. For example, an Order object with count and cost attributes could expose a total that is simply the product of the other two. Common object-oriented design patterns also become much easier: façades, delegation, and so forth.
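Under the hood, such a calculated attribute is just a Python property on the model class. Here is a minimal sketch of the pattern using a plain class (the Order model is hypothetical; on a real Django model, count and cost would be field columns):

```python
class Order(object):
    """Stand-in for a Django model; in a real model, count and cost
    would be IntegerField and DecimalField columns."""
    def __init__(self, count, cost):
        self.count = count
        self.cost = cost

    @property
    def total(self):
        # Read-only calculated attribute: derived from the stored
        # fields, never stored in the database itself
        return self.count * self.cost
```

Callers simply read order.total as if it were a normal attribute; no extra database column is involved.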
In Django, the ORM presents the option of overriding built-in database-altering methods such as saving and deleting objects. This enables you to easily define a set of arbitrary operations to be performed on your data before it is saved to the database or to ensure that certain clean-up operations are always called prior to deleting a record, no matter where or how the deletion occurs.
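The mechanism behind this is ordinary Python method overriding: you redefine the method and call the parent implementation when your extra work is done. A rough, non-Django sketch of the pattern, with a plain Record class standing in for models.Model (the title normalization is just one example of a pre-save operation):

```python
class Record(object):
    """Stand-in for django.db.models.Model; in Django, save() would
    perform the actual database write."""
    def save(self):
        self.saved = True

class Book(Record):
    def __init__(self, title):
        self.title = title
        self.saved = False

    def save(self):
        # Pre-save hook: clean up the data first...
        self.title = self.title.strip()
        # ...then hand off to the parent to do the real work
        super(Book, self).save()
```

Because every code path that saves the object goes through this one method, the cleanup happens no matter where the save originates.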
Integration with the programming language—in Django’s case, Python—is generally simple, enabling you to let your database objects conform to specific interfaces or APIs.
Due to their very nature—being a layer of code between your application and the database itself—ORMs provide excellent portability. Most ORM platforms support multiple database backends, and Django’s is no exception. At the time of this writing, code utilizing Django’s model layer runs on PostgreSQL, MySQL, SQLite, and Oracle—and this list is likely to grow as more database backend plugins are written.
Because you are rarely executing your own SQL queries when using an ORM, you don't have to worry as much about the issues caused by malformed or poorly protected query strings, which often lead to problems such as SQL injection attacks. ORMs also provide a central mechanism for intelligent quoting and escaping of input variables, freeing up time otherwise spent dealing with that sort of minutiae. This sort of benefit is common with modularized or layered software, of which MVC frameworks are a good example. When all the code responsible for a specific problem domain is well organized and self-contained, it can often be a huge time-saver and increase overall safety.
Although not directly related to the definition of models, one of the greatest benefits of using an ORM (and certainly one of the largest differences, compared to writing raw SQL) is the query syntax used to obtain records from the database. Not only is a higher-level query syntax arguably easier to work with, but the act of bringing the query mechanisms into the realm of Python enables a host of useful tactics and methodologies. For example, it becomes possible to construct otherwise unwieldy queries by looping over data structures, an approach that is generally more compact than the equivalent SQL and can avoid the sometimes tedious string manipulation that can be otherwise required.
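For instance, because Django's filter methods accept keyword arguments, you can assemble a query from a data structure with an ordinary loop and dictionary. A sketch of the idea (the search terms and field names are invented for illustration):

```python
# Hypothetical user-supplied search terms, mapping field name to value
search_terms = {"title": "moby", "genre": "fiction"}

# Build Django-style query keywords programmatically; each key becomes
# a case-insensitive "contains" lookup on the corresponding field
filters = {}
for field, value in search_terms.items():
    filters["%s__icontains" % field] = value

# With a real model at hand, the dictionary would then be unpacked as:
#   Book.objects.filter(**filters)
```

Doing the equivalent in raw SQL would mean concatenating WHERE clauses by hand, with all the quoting headaches that entails.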
Django models cover a wide range of field types; some of them are closely tied to their database implementations, although others have been designed with Web form interfaces in mind. Most of them fall between these two extremes. Although an exhaustive list can be found in the official Django documentation, we present a comparison study which covers some of the most commonly used fields. First, we provide a quick introduction to the basics of Django model definition.
from django.db import models

class Book(models.Model):
    title = models.CharField(max_length=100)
    author = models.ForeignKey(Author)
    length = models.IntegerField()
From the previous, it should be relatively obvious what we've just created: a simplistic model of a book made up of various database-related concepts. It's not much to look at (generally those tasked with cataloging books are interested in much more than just the title, author, and number of pages) but it'll do. It's also perfectly workable. You could throw that example into a Django models.py file and be well on your way to a book catalog app with very few modifications.
As you can see, Django uses Python classes to represent objects, which generally map to SQL tables, with attributes mapping to columns. These attributes are themselves objects, specifically subclasses of a Field parent class; as stated previously, some of them are obvious analogues to SQL column types, although others provide some level of abstraction. Let's examine some specific Field subclasses.
CharField and TextField: Possibly the most common fields you'll encounter, these two do much the same thing: they hold text. CharFields have a set, finite length, whereas TextFields are essentially unlimited in size; which one you use depends on your needs, including the full-text search capabilities of your database or your need for efficient storage.
EmailField, URLField, and IPAddressField: These three fields, among others, are essentially CharFields that provide extra validation. Such fields are stored in the database like a CharField, but have validation code defined to ensure their values conform to e-mail addresses, URLs, and IP addresses, respectively. It's simple to add your own validation to model fields and thus to create your own "field types" on the same level as Django's built-in ones. (See Chapters 6, "Templates and Form Processing," and 7, "Photo Gallery," for more on validation.)
BooleanField and NullBooleanField: BooleanField works in most situations where you want to store True or False values, but sometimes you need the capability to record the fact that you don't yet know whether the value is one or the other, in which case the field would be considered empty, or null; thus NullBooleanField was born. This distinction highlights the fact that modeling your data often requires some thought, and decisions sometimes need to be made on a semantic level as well as a technical one: not just how the data is stored, but what it means.
FileField: FileField is one of the most complex fields, in no small part because almost all the work involved in its use happens not in the database at all, but in the request part of the framework. FileField stores only a file path in the database, similar to its lesser cousin FilePathField, but goes the extra mile and provides the capability to upload a file from the user's browser and store it somewhere on the server. It also provides methods on its model object for accessing a Web-based URL for the uploaded file.
These are only a handful of the available field types present in Django model definitions, and as new Django releases come out, new fields are occasionally added or updated. To see the full, up-to-date list of model field classes and what you can do with them, see the official Django documentation. You also see many of these fields throughout this book in example code snippets and example applications in Part III, “Django Applications by Example.”
A common concept in relational database definition is that of a primary key, which is a field guaranteed to be unique across an entire table (or in Django ORM terms across an entire model). These primary keys are typically auto-incrementing integers because such a field is a simple and effective method of ensuring that each row in the table has a unique value.
They're also useful as reference points for relationships between models (which are covered in the next few sections). If a given Book object has an ID number of 5 and is guaranteed to be the only Book with that ID number, a reference to "book #5" is unambiguous.
Because this type of primary key is fairly ubiquitous, Django automatically makes one for you unless you specify otherwise. All models without an explicit primary key field are given an id attribute, which is a Django AutoField (an auto-incrementing integer). AutoFields behave just as normal integers do, and their underlying database column type varies depending on your database backend.
For those wanting more control over primary keys, simply specify primary_key=True for one of your model fields, and that field becomes the primary key for the table in place of id (which is omitted in such circumstances). This means the field's values must be completely unique, so specifying it for a string field such as a name or other identifier may not be a good idea unless you're absolutely certain you'll never have duplicates!
Speaking of duplicates, we'll also mention there's a similar argument that can be applied to just about any field in your model: unique=True. This enforces uniqueness for the field in question without making that field the primary key.
The capability to define relationships between model objects is, naturally, one of the strongest selling points for using relational databases (as evidenced by the name itself—relational) and is also an area where ORMs sometimes tend to differ from one another. Django’s current implementation is fairly database-centric, making sure the relations are defined at the database level and not just at the application level. However, because SQL only provides for one explicit form of relation—the foreign key—it is necessary to add some layering to provide more complex relationships. We examine the foreign key first and then move to how it can serve as a building block for the other relationship types.
Because foreign keys are fairly simple, Django's implementation of them is similarly straightforward. They're represented as their own Field subclass, ForeignKey, whose primary argument is simply the model class being referred to, as in the following example:
class Author(models.Model):
    name = models.CharField(max_length=100)

class Book(models.Model):
    title = models.CharField(max_length=100)
    author = models.ForeignKey(Author)
Note that we need to define the classes being referred to first, because otherwise the Author variable name would not be available for use in the Book class's ForeignKey field. However, you can use a string instead: either the bare class name if it's defined in the same file, or dot notation (for example, 'myapp.Author') otherwise. Here's the previous example rearranged and rewritten using a string-based ForeignKey:
class Book(models.Model):
    title = models.CharField(max_length=100)
    author = models.ForeignKey("Author")

class Author(models.Model):
    name = models.CharField(max_length=100)
It's also possible to define self-referential ForeignKeys by using the string 'self'. This is commonly used when defining hierarchical structures (for example, a Container class defining a parent attribute enabling nested Containers) or similar situations (such as an Employee class with attributes such as supervisor or hired_by).
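To see why a self-referential key is useful, consider the kind of traversal it enables. Here is a plain-Python sketch of the Container hierarchy (in a real model, parent would be a ForeignKey('self') column, and the walk would follow database rows rather than object references):

```python
class Container(object):
    """Stand-in for a model whose parent field is a self-referential
    ForeignKey; None marks a top-level container."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent

    def ancestors(self):
        # Walk the chain of parent references all the way to the root
        node, chain = self.parent, []
        while node is not None:
            chain.append(node.name)
            node = node.parent
        return chain
```

A single self-referential field is thus enough to represent arbitrarily deep nesting.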
Although the ForeignKey is only defined on one side of the relationship, the receiving end is able to follow the relationship backward. Foreign keys are technically a many-to-one relationship, as multiple "child" objects can refer to the same "parent" object; thus, the child gets a single reference to its parent, but the parent gets access to a set of its children. Using the previous example, you could use Book and Author instances such as:
# Pull a book off the shelf - see below in this chapter for details on querying
book = Book.objects.get(title="Moby Dick")

# Get the book's author - very simple
author = book.author

# Get a set of the books the author has been credited on
books = author.book_set.all()
As you can see, the "reverse relationship" from Author to Book is represented by the Author.book_set attribute (a manager object, outlined later in the chapter), which is automatically added by the ORM. It's possible to override this naming scheme by specifying a related_name argument to the ForeignKey; in the previous example, we could have defined author as ForeignKey("Author", related_name="books") and would then have access to author.books instead of author.book_set.
The use of related_name is optional for simple object hierarchies, but required for more complex ones, such as when you have multiple ForeignKeys leading from one object to another. In such situations, the ORM needs you to tell it how to differentiate the two reverse relationship managers on the receiving end of those two ForeignKey fields. Django's database management tools let you know by way of an error message if you forget!
Foreign keys are generally used to define one-to-many (or many-to-one) relationships; in our previous examples, a Book has a single Author and an Author can have many Books. However, sometimes you need more flexibility. For example, until now we've assumed a Book has only one Author, but what about books written by more than one person, such as this one?
Such a scenario requires a "many" relationship not only on one side (an Author having one or more Books) but on both (a Book also having one or more Authors). This is where the concept of many-to-many relationships comes in; because SQL has no definition for these, we must build them using the foreign keys it does understand.
Django provides a second relationship-oriented model field to handle this situation: ManyToManyField. Syntax-wise, it is identical to ForeignKey; you define it on one side of the relationship, passing in the class to relate to, and the ORM automatically grants the other side the necessary methods or attributes to use the relationship (typically by creating a _set manager, as seen previously with ForeignKeys). However, due to the nature of ManyToManyField, it doesn't generally matter which side you define it on because the relationship is inherently symmetrical.
If you plan on using Django’s admin application, keep in mind the admin forms for objects in a many-to-many relationship only display a form field on the defining side.
Self-referential ManyToManyFields (that is, a ManyToManyField on a given model referencing that same model) are symmetrical by default because it's assumed the relationship goes both ways. However, this is not always the case, and so it's possible to change this behavior by specifying symmetrical=False in the field definition.
Let’s update our book example with the newfound realization we must handle multiple-author books:
class Author(models.Model):
    name = models.CharField(max_length=100)

class Book(models.Model):
    title = models.CharField(max_length=100)
    authors = models.ManyToManyField(Author)
The usage of ManyToManyField is similar to the "many" side of a foreign key relationship:
# Pull a book off the shelf
book = Book.objects.get(title="Python Web Development with Django")

# Get the book's authors
authors = book.authors.all()

# Get all the books the third author has worked on
books = authors[2].book_set.all()
The secret of the ManyToManyField is that underneath, it creates an entirely new table in order to provide the lookups needed for such a relationship, and it is this table which uses the foreign key aspects of SQL; each row represents a single relationship between two objects, containing foreign keys to both.
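Conceptually, that hidden table is nothing more than rows of ID pairs, one row per relationship. A stdlib-only sketch of how such a table answers lookups in both directions (the IDs are invented for illustration):

```python
# Each row pairs a book ID with an author ID; one row per relationship,
# just like the rows of the hidden many-to-many table
book_authors = [
    (1, 1),  # book 1 was written by author 1...
    (1, 2),  # ...and also by author 2
    (2, 2),  # book 2 was written by author 2 alone
]

def authors_for(book_id):
    # Equivalent in spirit to book.authors.all()
    return [a for (b, a) in book_authors if b == book_id]

def books_for(author_id):
    # Equivalent in spirit to author.book_set.all()
    return [b for (b, a) in book_authors if a == author_id]
```

Both directions of the relationship resolve to scans of the same table, which is why neither model class needs to store a list of IDs itself.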
This lookup table is normally hidden during regular use of Django's ORM and cannot be queried on its own, only via one of the ends of the relationship. However, it's possible to specify a special option on a ManyToManyField, through, which points to an explicit intermediate model class. Use of through thus lets you manually manage extra fields on the intermediate class, while retaining the convenience of managers on the "ends" of the relationship.
The following is identical to our previous ManyToManyField example, but contains an explicit Authoring intermediate model, which adds a collaboration_type field to the relationship, and the through keyword pointing to it.
class Author(models.Model):
    name = models.CharField(max_length=100)

class Book(models.Model):
    title = models.CharField(max_length=100)
    authors = models.ManyToManyField(Author, through="Authoring")

class Authoring(models.Model):
    collaboration_type = models.CharField(max_length=100)
    book = models.ForeignKey(Book)
    author = models.ForeignKey(Author)
You can query Author and Book in an identical fashion to our earlier query example, and you can also construct queries dealing with the type of "authoring" that was involved.
# Get all essay compilation books involving Chun
chun_essay_compilations = Book.objects.filter(
    authors__name__endswith='Chun',
    authoring__collaboration_type='essays'
)
As you can see, this adds significant flexibility to Django’s ability to compose relationships meaningfully.
In addition to the commonly used many-to-one and many-to-many relationship types you've just seen, relational database development sometimes makes use of a third type, namely one-to-one relationships. As with the other two, the name means exactly what it says: both sides of the relationship have only a single related object.
Django implements this concept as a OneToOneField that is generally identical to ForeignKey: it requires a single argument, the class to relate to (or the string "self" to be self-referential). Also like ForeignKey, it optionally takes related_name so you can differentiate between multiple such relationships between the same two classes. Unlike its cousins, OneToOneField does not add a reverse manager for following the reverse relationship, just another normal attribute, because there's always only one object in either direction.

This relationship type is most often used to support object composition or ownership, and so is generally a bit less rooted in the real world than it is in object-oriented design. Before Django supported model inheritance directly (see the following), OneToOneField was typically used to implement inheritance-like relationships, and it now forms the behind-the-scenes basis for that feature.
As a final note regarding the definition of relationships, it's possible, for both ForeignKeys and ManyToManyFields, to specify a limit_choices_to argument. This argument takes a dictionary as its value, whose key/value pairs are query keywords and values (again, see the following for details on what those keywords are). This is a powerful method for restricting the possible values of the relationship you're defining.
For example, the following is a version of the Book model class that only works with Authors whose names end in Smith:
class Author(models.Model):
    name = models.CharField(max_length=100)

class SmithBook(models.Model):
    title = models.CharField(max_length=100)
    authors = models.ManyToManyField(Author, limit_choices_to={
        'name__endswith': 'Smith'
    })
It's also possible, and sometimes desirable, to specify this limitation at the form level. See the description of ModelChoiceField and ModelMultipleChoiceField in Chapter 6.
A relatively new feature in Django's ORM at the time of this writing is that of model inheritance. In addition to foreign key and other relationships between otherwise distinct model classes, it's possible to define models which inherit from one another in the same way that normal, non-ORM Python classes do (some examples can be found in Chapter 1, "Practical Python for Django").
For example, the previous SmithBook class could be defined not as its own stand-alone class that just happens to have the same two fields as the Book class, but as an explicit subclass of Book. The benefits are hopefully obvious: the subclass can then add or override only the fields that differentiate it from its parent, instead of replicating the entire definition of the other class.
Our simplistic Book example doesn't make this sound too exciting, but consider a more realistic model with a dozen or more attributes and a handful of complex methods, and suddenly inheritance becomes a compelling way to adhere to Don't Repeat Yourself (DRY). Do note, however, that composition (the use of ForeignKey or OneToOneField) is still a viable alternative! Which technique you use is entirely up to you and depends a lot on your planned model setup.
Django currently provides two different approaches to inheritance, each with its own pluses and minuses: abstract base classes and multi-table inheritance.
The approach of using abstract base classes is, to put it simply, “Python-only” inheritance—it enables you to refactor your Python model definitions such that common fields and methods are inherited from base classes. However, at a database and query level, the base classes don’t exist, and their fields are replicated in the database tables for the children.
This sounds like a violation of DRY, but is actually desirable in scenarios where you don’t want an extra database table for the base class—such as when your underlying database is legacy or otherwise being used by another application. It’s also just a neater way to express refactoring of class definitions without implying an actual object hierarchy.
Let's re-examine (and flesh out) the Book and SmithBook model hierarchy, using abstract base classes.
class Author(models.Model):
    name = models.CharField(max_length=100)

class Book(models.Model):
    title = models.CharField(max_length=100)
    genre = models.CharField(max_length=100)
    num_pages = models.IntegerField()
    authors = models.ManyToManyField(Author)

    def __unicode__(self):
        return self.title

    class Meta:
        abstract = True

class SmithBook(Book):
    authors = models.ManyToManyField(Author, limit_choices_to={
        'name__endswith': 'Smith'
    })
The key is the abstract = True setting in the Meta inner class of Book: it signifies that Book is an abstract base class and only exists to provide its attributes to the actual model classes which subclass it. Note SmithBook only redefines the authors field to provide its limit_choices_to option; because it inherits from Book instead of the usual models.Model, the resulting database layout has columns for title, genre, and num_pages, as well as a many-to-many lookup table for authors. The Python-level class also has a __unicode__ method defined as returning the title field, just as Book does.
In other words, when created in the database, as well as when utilized for object creation, ORM querying, and so forth, SmithBook behaves exactly as if it were the following definition:
class SmithBook(models.Model):
    title = models.CharField(max_length=100)
    genre = models.CharField(max_length=100)
    num_pages = models.IntegerField()
    authors = models.ManyToManyField(Author, limit_choices_to={
        'name__endswith': 'Smith'
    })

    def __unicode__(self):
        return self.title
As mentioned, this behavior extends to the query mechanism as well as the attributes of SmithBook instances, so the following query would be completely valid:
smith_fiction_books = SmithBook.objects.filter(genre='Fiction')
Our example isn't fully suited to abstract base classes, however: you'd typically want to create both normal Books as well as SmithBooks. Abstract base classes are, of course, abstract; they cannot be instantiated on their own and, as stated previously, are mostly useful for providing DRY at the model definition level. Multi-table inheritance, outlined next, is a better approach for our particular scenario.
Some final notes regarding abstract base classes: The inner Meta class on subclasses is inherited from, or combined with, that of the parent class (with the natural exception of the abstract option itself, which is reset to False, as well as some database-specific options such as db_table).
In addition, if a base class uses the related_name argument on a relational field such as ForeignKey, you need to use some string formatting so subclasses don't end up clashing. Don't use a normal string, such as "related_employees", but one with %(class)s in it, such as "related_%(class)s" (refer back to Chapter 1 if you don't recall the details about this type of string replacement). This way, the subclass name is substituted correctly, and collisions are avoided.
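The substitution itself is plain Python percent-formatting with a named placeholder; for each concrete subclass, the lowercased class name is dropped in, yielding a distinct reverse accessor name per subclass. A quick sketch:

```python
# The related_name template as it might appear in an abstract base class
template = "related_%(class)s"

# Substituting different lowercased subclass names produces
# non-clashing accessor names for each concrete model
for cls in ("book", "smithbook"):
    print(template % {"class": cls})
```

This is why the template form is safe where a fixed string would collide: each subclass ends up with its own name.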
Multi-table inheritance, at the definition level, appears only slightly different from abstract base classes. The use of Python class inheritance is still there, but one simply omits the abstract = True Meta class option. When examining model instances, or when querying, multi-table inheritance again looks the same as what we've seen before; a subclass appears to inherit all the attributes and methods of its parent class (with the exception of the Meta class, as we explain in just a moment).
The primary difference is the underlying mechanism. Parent classes in this scenario are full-fledged Django models with their own database tables, and they can be instantiated normally as well as lending their attributes to subclasses. This is accomplished by automatically setting up a OneToOneField between the subclasses and the parent class, and then performing a bit of behind-the-scenes magic to tie the two objects together, so the subclass inherits the parent class's attributes.
In other words, multi-table inheritance is just a convenience wrapper around a normal "has-a" relationship, or what's known as object composition. Because Django tries to be Pythonic, the "hidden" relationship is actually exposed explicitly if you need it, via the OneToOneField, which is given the lowercased name of the parent class with a _ptr suffix. For example, in the snippet that follows, SmithBook gets a book_ptr attribute leading to its "parent" Book instance.
The following is our Book and SmithBook example with multi-table inheritance:
class Author(models.Model):
    name = models.CharField(max_length=100)

class Book(models.Model):
    title = models.CharField(max_length=100)
    genre = models.CharField(max_length=100)
    num_pages = models.IntegerField()
    authors = models.ManyToManyField(Author)

    def __unicode__(self):
        return self.title

class SmithBook(Book):
    authors = models.ManyToManyField(Author, limit_choices_to={
        'name__endswith': 'Smith'
    })
As mentioned, the only difference at this point is the lack of the Meta class abstract option. Running manage.py syncdb on an empty database with this models.py file would create three main tables, one each for Author, Book, and SmithBook, whereas with abstract base classes we'd only have tables for Author and SmithBook.
Note SmithBook instances get a book_ptr attribute leading back to their composed Book instance, and Book instances that belong to (or that are part of, depending on how you look at it) SmithBooks get a smithbook (without a _ptr suffix) attribute.
Because this form of inheritance enables the parent class to have its own instances, Meta inheritance could cause problems or conflicts between the two sides of the relationship. Therefore, you need to redefine most Meta options that could otherwise have been shared between both classes (although ordering and get_latest_by are inherited if not defined on the child). This makes honoring DRY a little bit tougher, but as much as we'd like to achieve 100 percent DRY, it's not always possible.
Finally, we hope it's relatively clear why this approach is better for our book model: we can instantiate both normal Book objects as well as SmithBook objects. If you're using model inheritance to map out real-world relationships, chances are you'll prefer multi-table inheritance over abstract base classes. Knowing which approach to use, and when to use neither of them, is a skill that comes with experience.
The fields and relationships you define in your models provide the database layout and the variable names you use when querying your model later on, and you'll often find yourself adding model methods such as __unicode__ and get_absolute_url or overriding the built-in save or delete methods. However, there's a third aspect of model definition, and that's the inner class used to inform Django of various metadata concerning the model in question: the Meta class.
The Meta class, as the name implies, deals with metadata surrounding the model and its use or display: how its name should be displayed when referring to a single object versus multiple objects, what the default sort order should be when querying the database table, the name of that database table (if you have strong opinions on the subject), and so forth. In addition, the Meta class is where you define multi-field uniqueness constraints, because it wouldn't make sense to define those inside any single field declaration. Let's add some metadata to our first Book example from earlier.
class Book(models.Model):
    title = models.CharField(max_length=100)
    authors = models.ManyToManyField(Author)

    class Meta:
        # Alphabetical order
        ordering = ['title']
That's it! The Book class is so simple it doesn't need to define most of the options the Meta inner class provides, and if we didn't really care about a default ordering, it could have been left out entirely. Meta and Admin are entirely optional, albeit commonly used, aspects of model definition. Let's whip up a more complex example, because Book's meta options are fairly boring.
class Person(models.Model):
    first = models.CharField(max_length=100)
    last = models.CharField(max_length=100)
    middle = models.CharField(max_length=100, blank=True)

    class Meta:
        # The proper way to order people, assuming a Last, First M. style of
        # display.
        ordering = ['last', 'first', 'middle']

        # Here we encode the fact that we can't have a person with a 100%
        # identical name. Of course, in real life, we could, but we'll pretend
        # this is an ideal world.
        unique_together = ['first', 'last', 'middle']

        # Django's default pluralization is simply to add 's' to the end; that
        # doesn't work here.
        verbose_name_plural = "people"
As you can see from the comments, modeling the concept of a person would be rough going without defining some Meta options. We have to consider all three fields both when ordering records and when guarding against duplication, and having the system refer to more than one person as "persons" might be quaint, but it's probably not what we want.
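Conceptually, unique_together asks the database to enforce uniqueness on the tuple of the listed fields rather than on any one column. A stdlib-only sketch of the constraint's logic (in practice, the database does this for you with a composite unique index, so you would never write this check by hand):

```python
seen = set()

def add_person(first, last, middle):
    # The constraint applies to the combination of all three fields,
    # not to any single field individually
    key = (first, last, middle)
    if key in seen:
        raise ValueError("duplicate person: %s %s %s" % (first, middle, last))
    seen.add(key)
```

Two people may freely share a first and last name as long as any one of the three fields differs.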
For more details on the various Meta class options you can define, we defer you to the official Django documentation.
If you're using the "admin" contrib app that comes with Django, you'll be making heavy use of admin site objects and their register function, as well as optional ModelAdmin subclasses. These subclasses enable you to define various options concerning how your model is utilized when you're interacting with it in the admin application.
Simply registering your model class with the admin (along with enabling the Admin app itself, covered in Chapter 2, “Django for the Impatient: Building a Blog”) is enough to get the admin to pick it up and provide you with basic list and form pages; hooking in a ModelAdmin
subclass with extra options enables you to hand-pick the fields displayed in list views, the layout of the forms, and more.
In addition, you can specify inline editing options for relational fields such as ForeignKey
, by creating Inline
subclasses and referencing them in a ModelAdmin
subclass. This proliferation of extra classes can seem odd at first, but it’s an extremely flexible way of ensuring any given model can be represented in more than one way or in multiple admin sites. Extending the model hierarchy to inline editing also enables you to place an inline form in more than one “parent” model page, if desired.
We leave the detailed explanation of what each option does to the official documentation—and note there are some examples of admin usage in Part 3—but here’s a basic outline of what’s possible in each of the two main types of ModelAdmin
options.
List formatting: list_display
, list_display_links
, list_filter
, and similar options enable you to change the fields shown in list views (the default being simply the string representation of your model instances in a single column), as well as enable search fields and filter links so you can quickly navigate your information.
Form display: fields
, js
, save_on_top
, and others provide a flexible means of overriding the default form representation of your model, as well as adding custom JavaScript includes and CSS classes, which are useful if you want to try your hand at modifying the look and feel of the admin to fit the rest of your Web site.
Finally, realize that if you find yourself making very heavy use of these options, it can be a sign that you should consider bypassing the admin and writing your own administrative forms. However, make sure you read the “Customizing the Admin” section of Chapter 11, “Advanced Django Programming,” first for tips on just how far you can flex the Django admin before setting out on your own.
Now that we’ve explained how to define and enhance your models, we go over the details of how to create, and then query, a database based on them, finishing up with notes on the raw SQL underpinnings of the overall mechanism.
As mentioned previously in Chapter 2, the manage.py
script created with every Django project includes functionality for working with your database. The most common manage.py
command is syncdb
. Don’t let the name fool you; it doesn’t do a full synchronization of your database with your models as some users might expect. Instead, it makes sure all model classes are represented as database tables, creating new tables as necessary—but not altering existing ones.
Therefore, if you create a model, run syncdb
to load it into the database, and later make changes to that model, syncdb
does not attempt to reconcile those changes with the database. It is expected that the developer makes such changes by hand or via scripts, or simply drops the table or database entirely and reruns syncdb
, which results in a fully up-to-date schema. For now, what’s important is that syncdb
is the primary method for turning a model class into a database table or tables.
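The create-but-never-alter behavior is easy to demonstrate with the standard library’s sqlite3 module. This is only an analogy for syncdb’s semantics, not Django’s actual implementation; the table name and columns here are invented for illustration:

```python
import sqlite3

# A rough sqlite3 sketch of syncdb's semantics (not Django's actual code):
# create tables that are missing, but never alter existing ones.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS myapp_book (id INTEGER PRIMARY KEY, title TEXT)")

# Later the "model" grows an extra column; re-running a create-if-missing
# statement leaves the existing two-column table untouched.
conn.execute(
    "CREATE TABLE IF NOT EXISTS myapp_book "
    "(id INTEGER PRIMARY KEY, title TEXT, author TEXT)")

columns = [row[1] for row in conn.execute("PRAGMA table_info(myapp_book)")]
assert columns == ["id", "title"]  # the 'author' column was never added
```

Just as with syncdb, reconciling the new column is left to you: either alter the table by hand or drop and re-create it.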
In addition to syncdb
, manage.py
provides a handful of specific database-related functions which syncdb
actually builds upon to perform its own work. Table 4.1 shows a few of the more common ones. Among these are commands such as sql
and sqlall
, which display the CREATE TABLE
statements (sqlall
performs initial data loading as well); sqlindexes
for creating indexes; sqlreset
and sqlclear
, which empty or drop previously created tables; sqlcustom
, which executes an app’s custom initial SQL statements (see the following for more); and so forth.
Table 4.1. manage.py Functions

| Command | Description |
|---|---|
| syncdb | Create necessary tables needed for all apps |
| sql | Display the CREATE TABLE statements for an app |
| sqlall | Same as sql, plus statements for loading initial data |
| sqlindexes | Display the call(s) to create indexes for primary key columns |
| sqlclear | Display the DROP TABLE statements for an app |
| sqlreset | Combination of sqlclear and sql |
| sqlcustom | Display custom SQL statements from an app's sql/ directory |
| loaddata | Load initial fixtures (similar to initial SQL, but database-agnostic) |
| dumpdata | Dump current database contents to JSON, XML, and so on |
Unlike syncdb
, these sql*
commands do not update the database on their own. Instead, they simply print out the SQL statements in question, enabling the developer to read them for verification’s sake (ensuring a later syncdb
does what the developer intends, for example) or save them to a stand-alone SQL script file.
It’s also possible to pipe these commands’ output into one’s database client for immediate execution, in which case they can act as more granular analogues to syncdb
. You can also combine the two approaches by redirecting to a file first, modifying that file, and then redirecting the file into the database for execution (see Appendix A, “Command Line Basics,” for more on pipes and redirection).
For more information on how to use these commands and the intricacies of syncdb
, see the example application chapters in Part 3 or visit the official Django documentation.
Querying your model-generated databases requires the use of two distinct, but similar, classes: Manager
s and QuerySet
s. Manager
objects are always attached to a model class, so unless you specify otherwise, your model classes each exhibit an objects
attribute, which forms the basis for all queries in the database concerning that model. Manager
s are the gateway to obtaining info from your database; they have a handful of methods that enable you to perform typical queries.
all
: Return a QuerySet
containing all the database records for the model in question.
filter
: Return a QuerySet
containing the model records matching specific criteria.
exclude
: The inverse of filter
—find records that don’t match the criteria.
get
: Obtain a single record matching the given criteria (or raise an error if there are either no matches or more than one).
Of course, we’re getting ahead of ourselves—we haven’t explained what a QuerySet
really is yet. QuerySet
s can be thought of as simply lists of model class instances (or database rows/records), but they’re much more powerful than that. Manager
s provide a jumping-off point for generating queries, but QuerySet
s are where most of the action really happens.
QuerySet
s are multifaceted objects, making good use of Python’s dynamic nature, flexibility, and so-called “duck typing” to provide a trio of important and powerful behaviors; they are database queries, containers, and building blocks all rolled into one.
As evidenced by the name, a QuerySet
can be thought of as a nascent database query. It can be translated into a string of SQL to be executed on the database. Because most common SQL queries are generally a collection of logic statements and parameter matches, it makes sense that QuerySet
s accept a Python-level version of the same thing. QuerySet
s accept dynamic keyword arguments or parameters that are translated into the appropriate SQL. This becomes obvious in an example using the Book
model class from earlier in this chapter.
from myproject.myapp.models import Book

books_about_trees = Book.objects.filter(title__contains="Tree")
The keywords accepted are a mix of your model’s field names (such as title
in the previous example), double underscores for separation, and optional clarification words such as contains
, gt
for “greater than,” gte
for “greater than or equal to,” in
for set membership testing, and so forth. Each maps directly (or nearly so) to SQL operators and keywords. See the official documentation for details on the full scope of these operators.
Going back to our example, Book.objects.filter
is a Manager
method, as explained previously, and Manager
methods always return QuerySet
objects. In this case, we’ve asked the Book
default manager for books whose title contains the word “Tree” and have captured the resultant QuerySet
in a variable. This QuerySet
represents a SQL query that can look like this:
SELECT * FROM myapp_book WHERE title LIKE "%Tree%";
It’s entirely possible to make compound queries, such as one for the Person
model also defined previously:
from myproject.myapp.models import Person

john_does = Person.objects.filter(last="Doe", first="John")
which would result in the following SQL:
SELECT * FROM myapp_person WHERE last = "Doe" AND first = "John";
Similar results appear when using other previous Manager
methods, such as all
:
everyone = Person.objects.all()
which turns into the unsurprising SQL query:
SELECT * FROM myapp_person;
It should be noted the various query-related options defined in the optional Meta
model inner class affect the generated SQL, as you might expect; ordering
turns into ORDER BY
, for example. And as we explore later, QuerySet
’s extra methods and composition capabilities also transmute the SQL, which is eventually executed on the database.
Finally, if you speak SQL yourself and understand the implications of various query mechanisms (both in terms of the result sets and the execution times), you will be better equipped to construct ORM queries that are faster or more specific than ones you could otherwise have created. In addition, planned and current development work on Django makes it easier to pry open QuerySet
objects and tweak the resultant SQL—giving you more power than ever.
QuerySet
is list-like. It implements a partial list
interface and thus can be iterated over (for record in queryset:
), indexed (queryset[0]
), sliced (queryset[:5]
), and measured (len(queryset)
). As such, once you’re used to working with Python lists, tuples, and/or iterators, you already know how to use a QuerySet
to access the model objects within. Where possible, these operations are accomplished intelligently. For example, slicing and indexing make use of SQL’s LIMIT
and OFFSET
keywords.
On occasion, you may find you need to accomplish something with a QuerySet
that isn’t possible or desirable with the existing features Django’s ORM provides. In these cases, you can simply turn a QuerySet
into a list with list
, after which point it becomes a true list containing the entire result set. Although this is sometimes necessary or useful—such as when you want to do Python-level sorting—keep in mind this can cause a lot of memory or database overhead if your QuerySet
results in a large number of objects!
Django strives to provide as much power as possible with its ORM, so if you do find yourself thinking about casting to a list, make sure you spend a few minutes skimming this book or the official documentation, or poke around the Django mailing list archives. Chances are good you’ll find a way to solve your problem without pulling the entire QuerySet
into memory.
QuerySet
is lazy; it only executes a database query when it absolutely has to, such as when it is turned into a list or otherwise accessed in ways mentioned in the previous section. This behavior enables one of the most powerful aspects of QuerySet
s. They do not have to be stand-alone, one-off queries, but can be composed into complex or nested queries. This is because QuerySet
exposes many of the same methods as Manager
s do, such as filter
and exclude
, and more besides. Just like their Manager
counterparts, these methods return new QuerySet
objects—but this time they are further limited by the parent QuerySet
’s own parameters. This is easier to understand with an example.
from myproject.myapp.models import Person

doe_family = Person.objects.filter(last="Doe")
john_does = doe_family.filter(first="John")
john_quincy_does = john_does.filter(middle="Quincy")
With each step we cut down the query results by an order of magnitude, ending up with one result object, or at least very few, depending on how many John Quincy Does we have in our database. Because this is all just Python, you could use a one-liner.
Person.objects.filter(last="Doe").filter(first="John").filter(middle="Quincy")
Of course, the astute reader notices this provides nothing we couldn’t do with a single call to Person.objects.filter
. However, suppose we had received the earlier john_does QuerySet from a function call, or had taken it out of a data structure? In that case, we don’t know the specific contents of the query we’re handling—but we don’t always need to.
Imagine we have added a due_date
field to our Book
model and are responsible for displaying books that are overdue (defined, naturally, as books whose due date is earlier than today). We could be handed a QuerySet
containing all the books the library knows about, or all the fiction books, or all the books being returned by a specific individual—the point being that it’s some arbitrary collection of books. It’s possible to take such a collection and narrow it down to show only the books we’re interested in, namely the overdue ones.
from myproject.myapp.models import Book
from datetime import datetime
from somewhere import some_function_returning_a_queryset

book_queryset = some_function_returning_a_queryset()
today = datetime.now()

# __lt turns into a less-than operator (<) in SQL
overdue_books = book_queryset.filter(due_date__lt=today)
In addition to filtering in this manner, QuerySet
composition is absolutely required for nonsimple logical constructs, such as finding all the Book
s written by authors named Smith
and which are nonfiction.
nonfiction_smiths = Book.objects.filter(author__last="Smith").exclude(genre="Fiction")
Although it could have been possible to achieve the same result with query options for negation, such as genre__neq
or similar (something Django’s ORM used to support in the past), breaking that logic out into extra QuerySet
methods makes things much more compartmentalized. It’s also arguably easier to read this way by breaking the query down into a few discrete “steps.”
It should be noted that QuerySet
s have a handful of extra methods that aren’t present on Manager
objects because they only serve to modify a query’s results and don’t generate new queries on their own. The most commonly used such method is order_by
, which overrides the default ordering of a QuerySet
. For example, let’s say our previous Person
class is normally ordered by last name; we can get an individual QuerySet ordered by first name instead, such as:
from myproject.myapp.models import Person

all_sorted_first = Person.objects.all().order_by('first')
The resulting QuerySet
is the same as any other, but behind the scenes the SQL-level ORDER BY
clause was updated as we requested. This means we can continue to layer on more syntax for more complex queries, such as finding the first five people sorted by first name.
all_sorted_first_five = Person.objects.all().order_by('first')[:5]
You can even sort across model relationships by using the double-underscore syntax you’ve seen earlier. Let’s pretend for a moment our Person
model has a ForeignKey
to an Address
model containing, among other things, a state
field, and we want to order people first by state and then by last name. We could do the following:
sorted_by_state = Person.objects.all().order_by('address__state', 'last')
Finally, it’s possible to reverse a sort order by prefixing the identifying string with a minus sign, for example, order_by('-last')
. You can even reverse a whole QuerySet
(if, for example, your code was passing a QuerySet
around and you had no direct control over the previous call to order_by
) by calling a QuerySet
’s reverse
method.
Aside from ordering, there are a few other QuerySet
-only methods to consider, such as distinct
, which removes any duplicate entries in a QuerySet
by using SELECT DISTINCT
on the SQL side of things. Then there’s values
, which takes a list of field names (including fields on related objects) and returns a QuerySet
subclass, ValuesQuerySet
, containing only the requested fields as a list of dictionaries instead of normal model classes. values
also has a twin method called values_list
, which returns a list of tuples instead of dictionaries. Here’s a quick interactive example of values
and values_list
.
>>> from myapp.models import Person
>>> Person.objects.values('first')
[{'first': u'John'}, {'first': u'Jane'}]
>>> Person.objects.values_list('last')
[(u'Doe',), (u'Doe',)]
Another useful but often overlooked method is select_related
, which can sometimes help with a common ORM problem of having an undesirably large number of queries for conceptually simple operations. For example, if one were to loop over a large number of Person
objects and then display information on their related Address
objects (considering the scenario in the previous section), your database would be queried once for the list of Person
s, and then multiple times, one query per Address
. This would be a lot of queries if your list contained hundreds or thousands of people!
To avoid this, select_related
automatically performs database joins in the background to “prepopulate” related objects, so you end up with a single, larger query—databases are typically better performers with a few big queries than a ton of small ones. Note, however, select_related
does not follow relationships where null=True
is set, so keep that in mind if you’re designing a model layout geared for performance.
Some final notes on select_related
; you can control how “far” it reaches down a chain of relationships with the depth
argument to prevent a truly gigantic query from happening if you have a deep object hierarchy. Furthermore, you can select only a few specific relationships if you have a wider hierarchy with lots of links between objects by passing their field names as positional arguments.
As an example, the following is how one would use select_related
to do a simple join of our Person
with its address and to avoid any other defined ForeignKey
fields on either Person
or Address
:
Person.objects.all().select_related('address', depth=1)
Not a very exciting example, but that’s rather the point; select_related
and these other methods are only useful when you need to grab more or less than the query engine does by default. If you haven’t worked with medium or large Web sites before, these don’t seem too useful yet, but they’re indispensable once your application is fully developed, and you need to start worrying about performance!
Details on all these functions, as well as order_by
and reverse
, can be found on the official Django documentation.
QuerySet
is further augmented by a keyword-argument-encapsulating class named Q
, which enables even more complex logic, such as composition of AND and OR using the & and | operators (which, although similar in intent, should not be confused with the Python keywords and and or). The resulting Q
objects can be used in place of literal keyword arguments within filter
or exclude
methods, such as:
from myproject.myapp.models import Person
from django.db.models import Q

specific_does = Person.objects.filter(last="Doe").exclude(
    Q(first="John") | Q(middle="Quincy")
)
Although that example is rather contrived—there probably aren’t many situations where you’d care about searching for a specific first or middle name—it should illustrate how Q
is used.
Like QuerySet
s themselves, Q
objects can be composited together over time. The &
and |
operators, when used on Q
s, return new Q
objects equivalent to their operands. For example, you can create potentially large queries via looping.
first_names = ["John", "Jane", "Jeremy", "Julia"]
first_name_keywords = Q()  # Empty "query" to build on
for name in first_names:
    first_name_keywords = first_name_keywords | Q(first=name)
specific_does = Person.objects.filter(last="Doe").filter(first_name_keywords)
As you can see, we created a short for loop, primed our query with an empty Q object, and then kept “appending” to it by using the |
operator. This example actually isn’t the best—such a simple scenario would be served better by the __in
query operator—but hopefully it illustrates the potential power of composing Q
objects together programmatically.
We could have saved a few lines in the previous example by using some functional programming tools Python provides, namely list comprehensions, the built-in function reduce
, and the operator
module. The operator
module provides functional equivalents to operators, such as or_
for |
and and_
for &
. The three lines surrounding the for
loop could have been rewritten as reduce(or_, [Q(first=name) for name in first_names])
. As always, because Django is “just Python,” this sort of approach can be applied to just about any aspect of the framework.
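If you want to see the reduce pattern in action without a configured Django project, the following runnable sketch substitutes a tiny stand-in class (FakeQ, our own invention, not Django’s Q) that overloads | the same way:

```python
# The reduce pattern from the text, demonstrated with a tiny stand-in class
# (not Django's Q) that simply records the OR-ed keyword arguments.
from functools import reduce  # a plain built-in in older Pythons
from operator import or_

class FakeQ(object):
    def __init__(self, **kwargs):
        self.children = [kwargs] if kwargs else []

    def __or__(self, other):
        # Like Q, the | operator returns a new combined object.
        combined = FakeQ()
        combined.children = self.children + other.children
        return combined

first_names = ["John", "Jane", "Jeremy", "Julia"]
big_q = reduce(or_, [FakeQ(first=name) for name in first_names])
assert big_q.children == [{'first': name} for name in first_names]
```

Swap FakeQ for django.db.models.Q inside a real project and the reduce expression is exactly the one-liner described above.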
Finally, you can use the single-operand operator ~
with Q
objects to negate their contents. Although the QuerySet exclude
method is a more common solution for such queries, ~Q
shines when your query logic gets a bit more complex. Take for example this compound one-liner that grabs all the Does, plus anyone named John Smith, but not anyone named John W. Smith.
Person.objects.filter(
    Q(last="Doe") |
    (Q(last="Smith") & Q(first="John") & ~Q(middle__startswith="W"))
)
Tacking on exclude(middle__startswith="W")
to such a query wouldn’t have quite done the trick—it would have excluded any Does with a middle initial of “W,” which is not what we want—but we were able to express our specific intentions with ~Q
.
As a final word on what you can accomplish with Django’s query mechanisms (and a lead-in to the next section about what they aren’t currently capable of), we examine the QuerySet
method extra
. It’s a versatile method, which is used to modify aspects of the raw SQL query that is generated by your QuerySet
, accepting four keyword arguments that we describe in Table 4.2. Note the examples in this section can make use of attributes that were not defined in earlier model examples, for the sake of being more illustrative.
The select
parameter expects a dictionary of identifiers mapped to SQL strings, which enables you to add custom attributes to the resultant model instances based on SQL SELECT
clauses of your choosing. These are handy when you want to define simple additions to the information you pull out of the database and limit those to only a few parts of your code (as opposed to model methods, which execute their contents everywhere). In addition, some operations are simply faster in the database than they would be in Python, which can be useful for optimization purposes.
Here’s an example of using select
to add a simple database-level logic test as an extra attribute:
from myproject.myapp.models import Person

# SELECT first, last, age, (age > 18) AS is_adult FROM myapp_person;
the_folks = Person.objects.all().extra(select={'is_adult': "age > 18"})

for person in the_folks:
    if person.is_adult:
        print "%s %s is an adult because they are %d years old." % (
            person.first, person.last, person.age)
The where
parameter takes as input a list of strings containing raw SQL WHERE clauses, which are dropped straight into the final SQL query as-is (or almost as-is; see params
in the following). where
is best used in situations when you simply can’t make the right query by using attribute-related keyword arguments such as __gt
or __icontains
. In the following example, we use the same SQL-level construct to both search by, and return, a concatenated string using PostgreSQL-style concatenation with ||
:
# SELECT first, last, (first||last) AS username FROM myapp_person
#     WHERE first||last ILIKE 'jeffreyf%';
matches = Person.objects.all().extra(
    select={'username': "first||last"},
    where=["first||last ILIKE 'jeffreyf%'"])
Possibly the simplest extra
parameter is tables
, which enables you to specify a list of extra table names. These names are then slotted into the FROM
clause of the query, often used in tandem with JOIN statements. Remember that, by default, Django names your tables appname_modelname
.
Here’s an example of tables
, which deviates a bit from the rest (and returns to the Book
class with an additional author_last
attribute) for brevity’s sake:
from myproject.myapp.models import Book

# SELECT * FROM myapp_book, myapp_person WHERE last = author_last
joined = Book.objects.all().extra(
    tables=["myapp_person"], where=["last = author_last"])
Finally, we come to the params
argument. One of the “best practices” of performing database queries from higher-level programming languages is to properly escape or insert dynamic parameters. A common mistake among beginners is to do simple string concatenation or interpolation to get their variables into the SQL query, but this opens up a whole host of potential security holes and bugs.
Instead, when using extra
, make use of the params
keyword, which is simply a list of the values to use when replacing %s
string placeholders in the where
strings, such as:
from myproject.myapp.models import Person
from somewhere import unknown_input

# Incorrect: will "work", but is open to SQL injection attacks and related
# problems. Note that the '%s' is being replaced through normal Python
# string interpolation.
matches = Person.objects.all().extra(
    where=["first = '%s'" % unknown_input()])

# Correct: will escape quotes and other special characters, depending on
# the database backend. Note that the '%s' is not replaced with normal
# string interpolation but is filled in via the 'params' argument.
matches = Person.objects.all().extra(
    where=["first = '%s'"], params=[unknown_input()])
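The same principle can be demonstrated with the standard library’s sqlite3 module, a DB-API adapter like the ones Django builds on; note that its placeholder style is ? rather than %s, a detail that varies between adapters. The table and data here are invented for the demonstration:

```python
import sqlite3

# Parameterized queries with the stdlib sqlite3 adapter; its placeholder is
# '?' rather than '%s' (placeholder style varies between DB-API modules).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE myapp_person (first TEXT, last TEXT)")
conn.execute("INSERT INTO myapp_person VALUES (?, ?)", ("John", "Doe"))

# Hostile input full of quotes is harmless when passed as a parameter:
# it is treated as a literal value, never as SQL.
evil = "x' OR '1'='1"
rows = conn.execute(
    "SELECT last FROM myapp_person WHERE first = ?", (evil,)).fetchall()
assert rows == []  # no match, and no injection

rows = conn.execute(
    "SELECT last FROM myapp_person WHERE first = ?", ("John",)).fetchall()
assert rows == [("Doe",)]
```

Had the hostile string been interpolated into the SQL directly, the `' OR '1'='1` fragment would have matched every row.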
The final word on Django’s model/query framework is that, as an ORM, it simply can’t cover all the possibilities. Few ORMs claim to be a 100 percent complete replacement for interfacing with one’s database via the regular channels; Django is no different, although the developers are always working to increase its flexibility. Sometimes, especially for those with extensive prior experience with relational databases, it’s necessary to step outside the ORM. The following sections are a few thoughts on how this is possible.
Aside from standard tables and columns, most RDBMS packages provide additional features such as views or aggregate tables, triggers, the capability to define “cascade” behavior when rows are deleted or updated, and even custom SQL-level functions or datatypes. Django’s ORM—like most others—is largely ignorant of such things at the time of this writing, but that doesn’t mean you can’t use them.
One recently added aspect of Django’s model definition framework is the capability to define custom initial SQL files, which must be .sql
files within a sql
subdirectory of an application, such as myproject/myapp/sql/triggers.sql
. Any such file is automatically executed on your database whenever you run manage.py
SQL-related commands such as reset
or syncdb
and is included in the output of sqlall
or sqlreset
. The feature has its own manage.py
command, sqlcustom
, which (such as the other sql*
commands) prints out the custom SQL it finds.
Through use of initial SQL, you can store schema definition commands within your Django project and know that they are always included when you use Django’s tools to build or rebuild the database. Most of the following bullet points can be accomplished by making use of this feature:
Views: Because SQL views are effectively read-only tables, you can support them by creating Model definitions to mirror their layout, and then use the normal Django query API to interact with them. Note you need to be careful not to execute any manage.py
SQL-related commands that would attempt to write such a model to the database, or you’d run into problems. As with any SQL library accessing such views, attempts to write to the view’s table result in errors.
Triggers and cascades: Both work just fine with inserts or updates generated by normal ORM methods and can be defined via initial SQL files, depending on your database (cascade constraints can be manually added to the output of manage.py sqlall
, if they cannot be created after the fact).
Custom functions and datatypes: You can define these in initial SQL files, but need to make use of QuerySet.extra
to reference them from within the ORM.
Although not technically a SQL extra per se, we’d like to include in this section a look at a Django feature related to working with your database outside the ORM itself: fixtures. Fixtures, touched on briefly in Chapter 2, are a name for sets of database data stored in flat files, which are not usually raw SQL dumps, but a database-agnostic (and often human-readable) representation, such as XML, YAML, or JSON.
The most common use for fixtures is as initial data loaded into a database when it is created or re-created, such as “prefilling” database tables used for categorizing user-entered data or loading up test data during application development. Django supports this in a similar fashion to the initial SQL outlined previously; each Django app is searched for a fixtures
subdirectory, and within it, a file named initial_data.json
(or .xml
, .yaml
, or another serialization format). These files are then read using Django’s serialization module (see Chapter 9, “Liveblog,” for more on this topic) and their contents used to create database objects, whenever database create/reset commands are run, such as manage.py syncdb
or reset
.
Here’s a quick example of what a simple JSON fixture file for our Person
model class can look like:
[
    {
        "pk": "1",
        "model": "myapp.person",
        "fields": {
            "first": "John",
            "middle": "Q",
            "last": "Doe"
        }
    },
    {
        "pk": "2",
        "model": "myapp.person",
        "fields": {
            "first": "Jane",
            "middle": "N",
            "last": "Doe"
        }
    }
]
and the output from importing it into our database:
user@example:/opt/code/myproject $ ./manage.py syncdb
Installing json fixture 'initial_data' from '/opt/code/myproject/myapp/fixtures'.
Installed 2 object(s) from 1 fixture(s)
In addition to initial data, fixtures are also useful as a more ‘neutral’ (although sometimes less efficient or specific) database dump format than using your database’s SQL dump tool—for example, you could dump a Django application’s data from a PostgreSQL database and then load it into a MySQL database, something that’s not nearly as easy without the intermediate translation step of fixtures. This is accomplished with manage.py dumpdata
and/or loaddata
.
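Stripped of the Django machinery, the round-trip at the heart of dumpdata and loaddata is ordinary serialization. This stdlib-only sketch shows the idea with the json module, using the same fixture shape as the examples in this section:

```python
import json

# A stdlib-only sketch of the fixture idea: serialize records into a
# database-neutral JSON structure, then rebuild them (no SQL dialect needed).
people = [
    {"pk": 1, "model": "myapp.person",
     "fields": {"first": "John", "middle": "Q", "last": "Doe"}},
    {"pk": 2, "model": "myapp.person",
     "fields": {"first": "Jane", "middle": "N", "last": "Doe"}},
]

dumped = json.dumps(people, indent=4)   # what dumpdata-style output holds
restored = json.loads(dumped)           # what loaddata-style input parses
assert restored == people
assert restored[0]["fields"]["last"] == "Doe"
```

Because the intermediate form is plain JSON rather than SQL, the dumping and loading sides need not even be the same database engine.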
When using dumpdata
and loaddata
, the location and name of the fixtures used is more flexible than with initial data. They can have any name (as long as the file extension is still that of a supported format) and can live in the fixtures
subdirectory, any directory specified in the FIXTURE_DIRS
setting, or even on an explicit path provided to loaddata
or dumpdata
. For example, we can dump out the two Person
objects imported previously, like so (using the indent
option to make the output more human-readable).
user@example:/opt/code/myproject $ ./manage.py dumpdata --indent=4 myapp > /tmp/myapp.json
user@example:/opt/code/myproject $ cat /tmp/myapp.json
[
    {
        "pk": 1,
        "model": "myapp.person",
        "fields": {
            "middle": "Q",
            "last": "Doe",
            "first": "John"
        }
    },
    {
        "pk": 2,
        "model": "myapp.person",
        "fields": {
            "middle": "N",
            "last": "Doe",
            "first": "Jane"
        }
    }
]
As you can see, fixtures are a useful way of dealing with your data in a format that’s a bit easier to work with than raw SQL.
Finally, it’s important to remember if the ORM (including the flexibility provided by extra
) doesn’t meet your query-related needs, it’s always possible to execute fully custom SQL by using a lower-level database adapter. Django’s ORM uses these modules to interface to your database as well. These modules are database-specific, depending on your Django database setting, and likely conform to the Python DB-API specification. Simply import the connection
object defined in django.db
, obtain a database cursor from it, and query away.
from django.db import connection

cursor = connection.cursor()
cursor.execute("SELECT first, last FROM myapp_person WHERE last='Doe'")
doe_rows = cursor.fetchall()
for row in doe_rows:
    print "%s %s" % (row[0], row[1])
See the Python DB-API documentation, the “Database” chapter of Core Python Programming, and/or the database adapter documentation (see withdjango.com) for details on the syntax and method calls provided by these modules.
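Because the stdlib sqlite3 module follows the same DB-API specification, you can try an essentially identical calling pattern without a configured Django project (sqlite3 uses the ? placeholder style, rather than the %s style used by some other adapters; the table and rows below are invented for the demonstration):

```python
import sqlite3

# The same DB-API calling pattern as the Django example, but against the
# stdlib sqlite3 module: connect, get a cursor, execute, fetch.
conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute("CREATE TABLE myapp_person (first TEXT, last TEXT)")
cursor.executemany(
    "INSERT INTO myapp_person VALUES (?, ?)",
    [("John", "Doe"), ("Jane", "Doe"), ("Bob", "Smith")])

cursor.execute(
    "SELECT first, last FROM myapp_person WHERE last = ?", ("Doe",))
doe_rows = cursor.fetchall()
names = ["%s %s" % (first, last) for first, last in doe_rows]
assert names == ["John Doe", "Jane Doe"]
```

Everything here—cursor creation, execute/executemany, fetchall—carries over directly to the connection object Django exposes in django.db.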
We’ve covered a lot of ground in this chapter, and with luck, you’ve come out of it with an appreciation for the amount of thinking that can (and usually should) go into one’s data model. You’ve learned how ORMs can be useful, learned how to define simple Django models as well as more complex ones with various relationships, and seen the special inner classes Django uses to specify model metadata. In addition, we hope you’re convinced of the power and flexibility of the QuerySet
class as a means of pulling information out of your models and understand how to work with your Django application’s data outside the ORM itself.
In the next two chapters, Chapters 5, “URLs, HTTP Mechanisms, and Views” and 6, you learn how to make use of your model in the context of a Web application by setting up queries in your controller logic (views) and then displaying them in your templates.