Building A Full Stack Web Application – Part 3: Data Context

This post is a continuation of a series about building a full stack web application.

Microsoft’s Entity Framework (EF) is built around the concept of a data context, which is essentially a bridge between a database and your classes, allowing us to interact with data as objects.  The EF class DbContext allows us to query and persist data in an object-oriented way, without having to deal with yucky SQL statements, database connection strings or transactions.

Context 

Back to Stocks Tracker.  Now that the data was modeled, I added another class library project – Data.Context – which would be responsible for generating the database tables, keys and indexes, as well as seeding the database with any reference data.  This project provides a custom abstraction of DbContext, available to other layers.

But wait.  Remember how in a previous post I showed how Microsoft’s Identity framework can be used to generate security classes that work hand-in-hand with EF?  Well, Identity also provides its own implementation of DbContext, IdentityDbContext, which allows us to interact with security data as objects.  Since I want to be able to query security objects via EF, the Stocks Tracker context will actually be derived from IdentityDbContext.

Let’s look a little closer at the context Stocks Tracker exposes to its callers.


/// <summary>
/// Class provides access to Entity Framework CRUD actions within the Stocks Tracker domain.
/// </summary>
public class StocksTrackerContext : IdentityDbContext<ApplicationUser>, IStocksTrackerContext

As you can see, StocksTrackerContext derives from IdentityDbContext<ApplicationUser>, using the ApplicationUser class defined in the Data.Models project.  It also implements an interface, IStocksTrackerContext, so that callers are limited to the data I want them to see.
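The IStocksTrackerContext interface itself isn’t shown in this post, but a minimal sketch of what it might expose would look something like this (the member list is illustrative, not the actual interface):

/// <summary>
/// Sketch of the abstraction callers receive; the real interface may expose more members.
/// </summary>
public interface IStocksTrackerContext : IDisposable
{
    // Only the sets callers are allowed to work with.
    IDbSet<Stock> Stocks { get; }

    // Persists any pending changes to the database.
    int SaveChanges();
}

Next, the constructors.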

public StocksTrackerContext()
    : base("name=StocksTrackerContext")
{
}

public StocksTrackerContext(string connectionString, bool eagerOpen)
    : base(connectionString)
{
    _eagerOpen = eagerOpen;
    if (eagerOpen)
        Database.Connection.Open();
}

The constructors are interesting.  Passing “name=StocksTrackerContext” tells EF to search the project’s configuration file for a connectionStrings entry with a matching name, and to use that connection string to connect to the database.  The context can throw errors when called from a different project without a configuration file, so rather than duplicate config values all over the place, I prefer the second constructor, where an explicit connection string can be provided.
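For example, a caller could new up the context with its own connection string.  A rough sketch; the LocalDB connection string below is just a placeholder, not the actual Stocks Tracker database:

// Hypothetical usage of the second constructor; the connection string is an example only.
var connectionString =
    @"Data Source=(localdb)\MSSQLLocalDB;Initial Catalog=StocksTracker;Integrated Security=True";

using (var context = new StocksTrackerContext(connectionString, eagerOpen: true))
{
    // The connection is already open here, thanks to eagerOpen.
}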


public IDbSet<Stock> Stocks
{
    get { return _stocks ?? (_stocks = Set<Stock>()); }
}

protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    modelBuilder.Configurations.Add(new StockConfiguration());
    modelBuilder.Configurations.Add(new StockTrackerConfiguration());
    modelBuilder.Configurations.Add(new StockTrackerStockConfiguration());
    base.OnModelCreating(modelBuilder);
}

EF provides its own flavor of IEnumerable (and IQueryable) in the generic IDbSet interface.  This gives you LINQ functionality over the collection, plus the ability to add and remove objects from the underlying set.
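Once the database is in place, working with a set is plain LINQ plus Add and SaveChanges.  A rough sketch (the ticker symbol is just an example, and the Stock initializer assumes the properties configured below):

using (var context = new StocksTrackerContext())
{
    // LINQ query against the Stocks set.
    var stock = context.Stocks.FirstOrDefault(s => s.TickerSymbol == "MSFT");

    // Add the entity if it doesn't exist yet, then persist the change.
    if (stock == null)
        context.Stocks.Add(new Stock { TickerSymbol = "MSFT" });

    context.SaveChanges();
}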

But first we need a database.  EF provides a ton of configuration options for dealing with data, but I like to keep it simple.  Basically it boils down to configuring your entities and migrating your data.  For configuration, EF provides the generic base class EntityTypeConfiguration, which lets you set constraints on a model’s table (key, name) and its columns (size, nullability).

Configuration

/// <summary>
/// Defines the entity type configuration for the <see cref="Stock"/> domain model.
/// </summary>
public class StockConfiguration : EntityTypeConfiguration<Stock>
{
    /// <summary>
    /// Instantiates the StockConfiguration class.
    /// </summary>
    public StockConfiguration()
    {
        HasKey(s => s.StockId);

        Property(s => s.TickerSymbol)
            .IsRequired()
            .HasMaxLength(10);

        Property(s => s.OpenPrice)
            .IsOptional()
            .HasPrecision(8, 2);

        HasMany(s => s.StockTrackerStocks)
            .WithRequired(sts => sts.Stock)
            .HasForeignKey(fk => fk.StockId);

        ToTable("Stock");
    }
}

With this in place, EF has what it needs to translate a configuration class into the DDL statements used to create or update a database schema.  Plus, with migrations enabled, EF tracks schema changes for you and will add or drop columns as the configuration class evolves.

Migration

For migrating data, EF provides an extensible base class, DbMigrationsConfiguration, whose generic type parameter must be a DbContext.  This is where you pass your custom implementation of DbContext, in this example StocksTrackerContext.

DbMigrationsConfiguration provides a single overridable method called Seed, which is passed the concrete implementation of DbContext.  From there, any statements needed to “seed” the database with reference data or default users can be executed via the context.


/// <summary>
/// Class for migrating data to the Stocks Tracker domain.
/// </summary>
public class StocksTrackerMigrationsConfiguration : DbMigrationsConfiguration<StocksTrackerContext>

protected override void Seed(StocksTrackerContext context)
{
   // query, add, edit or delete data as needed
}
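As a rough sketch, a Seed override that adds a couple of reference stocks might look like the following.  The ticker symbols are placeholders; AddOrUpdate is an extension method from the System.Data.Entity.Migrations namespace that keeps the seed idempotent across runs:

protected override void Seed(StocksTrackerContext context)
{
    // AddOrUpdate keys on TickerSymbol, so re-running the seed won't create duplicate rows.
    context.Stocks.AddOrUpdate(
        s => s.TickerSymbol,
        new Stock { TickerSymbol = "MSFT" },
        new Stock { TickerSymbol = "AAPL" });

    context.SaveChanges();
}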

Initialization

With the context, configuration and migration classes in place, all that’s missing is a process that ties them all together.  Luckily, EF provides a number of options for initializing the database.  For Stocks Tracker I used the MigrateDatabaseToLatestVersion initializer, which is able to detect configuration changes and update the database schema accordingly (the Seed method will also be called during this process).


Database.SetInitializer(
 new MigrateDatabaseToLatestVersion<StocksTrackerContext, StocksTrackerMigrationsConfiguration>());

What’s nice about this is that the initialization can be called from within some bootstrapping code, or compiled into an executable run during an installation or some other process.  I made a simple console project that initializes the database, then writes data to the console, so I know right away when there’s an error with the configuration or migration.  The options EF provides make it easy to configure the database and context to work in development, testing or production environments.
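The console bootstrapper isn’t much more than this kind of thing (a sketch; error handling and the connection string are left out):

static void Main(string[] args)
{
    Database.SetInitializer(
        new MigrateDatabaseToLatestVersion<StocksTrackerContext, StocksTrackerMigrationsConfiguration>());

    using (var context = new StocksTrackerContext())
    {
        // Touching the database forces the initializer (and Seed) to run.
        context.Database.Initialize(force: false);

        // Write something back so configuration or migration errors surface immediately.
        foreach (var stock in context.Stocks)
            Console.WriteLine(stock.TickerSymbol);
    }
}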

Summing Up

As I mentioned, EF provides many options for creating a context that allows your application to connect to a data store, including the exact opposite of what the Stocks Tracker does (starting with an existing, populated database, then generating the configuration and context classes from it).

One cool option is Simon Hughes’ Reverse POCO Generator, which, given an existing database, will reverse engineer and generate POCO, configuration and context classes from tables and views, and will even generate classes for calling stored procedures.  A familiarity with T4 templates is a big help, as most generators leverage T4 to some extent.

That was a lot to cover, but it’s enough for what I want Stocks Tracker to accomplish.  Next, we’ll look at the business layer.

Building A Full Stack Web Application – Part 2: Modeling the Data

This post is a continuation of a series about building a full stack web application.

There were a couple of goals I wanted to accomplish with the Stocks Tracker data.  The first goal was to keep any data access or modeling classes isolated in their own projects, accessible only through abstractions, so that any layer(s) reliant on data would have no knowledge of how the underlying data store worked.  The second goal was to leverage Microsoft’s Entity Framework (EF) and Identity frameworks.

Models

For data modeling, I added a new class library project to the Stocks Tracker solution, and called it Data.Models.  The data model was pretty simple, and only required adding a few NuGet packages, so that EF and Identity could work together:


<?xml version="1.0" encoding="utf-8"?>
<packages>
<package id="EntityFramework" version="6.1.0" targetFramework="net45" />
<package id="Microsoft.AspNet.Identity.Core" version="2.0.1" targetFramework="net45" />
<package id="Microsoft.AspNet.Identity.EntityFramework" version="2.0.1" targetFramework="net45" />
</packages>

The classes were simple POCOs that EF uses to define and map entity relationships to an underlying data store.  Pretty standard stuff, if you prefer EF’s code-first approach to generating database schemas.  However, throw in Identity and EF will also generate the objects the Identity framework uses for security, such as tables to store user accounts, along with support for third-party logins, such as Google, right out of the box.  Which is pretty cool, and necessary for a modern Web application.
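For example, the Stock model is just a plain class carrying the properties mapped by the StockConfiguration class covered in Part 3 (above).  This is a simplified sketch; the key type and the navigation collection are assumptions, not the full class:

/// <summary>
/// Simplified sketch of the Stock domain model.
/// </summary>
public class Stock
{
    public int StockId { get; set; }

    public string TickerSymbol { get; set; }

    public decimal? OpenPrice { get; set; }

    // Navigation property to the join entity linking stocks to stock trackers.
    public virtual ICollection<StockTrackerStock> StockTrackerStocks { get; set; }
}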

The only thing needed was to create a class derived from IdentityUser; EF is smart enough to create a security schema containing all the IdentityUser properties, as well as any custom properties defined in the derived class.  Mine adds the properties FirstName and LastName, which will be added as columns to the AspNetUsers table.


/// <summary>
/// Encapsulates the properties of an application user object.
/// </summary>
public class ApplicationUser : IdentityUser
{
    /// <summary>
    /// Instantiates the ApplicationUser class.
    /// </summary>
    public ApplicationUser()
    {
    }

    /// <summary>
    /// Gets and sets the first name value.
    /// </summary>
    public string FirstName { get; set; }

    /// <summary>
    /// Gets and sets the last name value.
    /// </summary>
    public string LastName { get; set; }
}

And that’s it for data modeling.  Up next, the data context.

Building A Full Stack Web Application – Part 1: Why

When I made the decision to move my career away from Windows desktop application development to web-based programming, I had a monumental task ahead of me.  I had almost zero knowledge of how the Web worked.  I knew a little bit of HTML; JavaScript was clunky, to be avoided at all cost; and CSS was a complete mystery.

To make matters worse, my first foray into the Web was trying to learn ASP.NET’s Web Forms.  Man, did that suck.  A confusing melding of tightly coupled business logic, DOM element manipulation and styling (sometimes even JavaScript!) smashed into a code-behind file, along with an “HTML” page of elements that talked to the code-behind (or didn’t).

Throw in a confusing encapsulation of the request/response cycle as a series of page-level events, plus controls that rendered HTML in a black box, and what you got was a working web page with no knowledge of the Web.  Instead, you just learned Web Forms.  No dice.

Enter ASP.NET MVC.  Now this was something I could sink my teeth into!  An architecture that embraced a clean separation of responsibilities by following the Model-View-Controller design pattern.  Absolute control over the HTML rendered.  An unobtrusive way to mix server code and markup via the Razor syntax.  A design that favored “convention over configuration,” but was still easily extensible.

Still, most MVC training and guides crammed everything from data access to business logic into the controllers, making them an untestable mess.  I was coming from a world of software best practices that embraced the SOLID principles and put a premium on unit testing.  Could I apply these practices and methodologies to web applications as successfully as I had to desktop applications?

Well, I was going to give it a shot.  After a lot of thought and research I came up with an idea for an application, and a plan for making it happen.  The application would be structured in the classic layers of a full stack platform: data layer; business layer; API; presentation/UI layer; and a network layer for hosting the whole thing.

Microsoft technologies would provide a measure of familiarity, but open source packages would be used where appropriate.  For source control, I chose GitHub due to its cost (nothing) and ease.  My IDE would be Visual Studio, and my database SQL Server.  Autofac would be my IoC container, as it was made to work with the .NET Framework.

My application would allow people to search for and track their favorite stocks.  I would imaginatively name it the Stocks Tracker.  This is how I built it.

Up first, the data layer.