

Reference Solution: ASP.NET 2.0 Internet Security Reference Implementation

Jason Taylor, Greg Foltz

Applies To

* Pet Shop 4
* ASP.NET 2.0
* SQL Server 2005
* Windows Server 2003

Summary

This solution architecture illustrates and implements security guidance as it applies to an Internet-facing ASP.NET 2.0 web application. The guidelines in this document are based on “Security Guidelines: ASP.NET 2.0” and “Security Guidelines: .NET Framework 2.0”; however, they are specific to the Internet scenario and are placed within the context of a full sample application in a development environment. See “Scenario and Solution: Forms Auth to SQL, Roles in SQL” to view this guidance as it applies to the Internet, Forms Authentication scenario.

This solution architecture uses the Pet Shop 4 sample application as a baseline. Pet Shop 4 has been updated in order to represent a reference implementation for ASP.NET 2.0 security best practices. These implemented security best practices are commented in the code and described in detail below.

See the Additional Resources section below for more information on other implementations customizing Pet Shop 4 for different scenarios.

Contents

* How To Use This Module
* Internet Scenario
* Key Use Cases
* Requirements
* Solution
* Problem and Solution Table
* Solution Rationalization Table
* Solution Implementation
* Authentication
* Authorization
* Input Validation
* Data Access
* Exception Management
* Sensitive Data
* Auditing and Logging
* Companion Guidance
* Additional Resources
* Appendix

How to Use This Module

To get the most from this module:
* Understand the application scenario and architecture. The design and implementation decisions described below are specific to the application scenario and architecture.
* Use the solution summary table. Use the summary table to review the security decisions made. The summary table contains all decisions made for the reference application and the rationale behind each.
* Learn the solution details. Browse the details in this document to learn more about each security decision.
* Browse the reference implementation source code. The Pet Shop 4 Security Reference Implementation is included in this download. Use the source code in conjunction with this document to fully understand the security decisions. Each security best practice is clearly annotated in the code with comments.
* Use the Companion documents. See, “Scenario and Solution: Forms Auth to SQL, Roles in SQL” to learn more about the scenario the reference application is running in.

Internet Scenario

In this scenario, the machine hosting the web application is a dedicated web server hosted by the organization that developed the application. The application is accessed over the Internet by the general public using a variety of browsers. It is administered by the organization’s IT department.


Key Use Cases

Pet Shop 4 is an e-commerce web application that allows customers to browse and purchase pets. It has the following functionality:
* View products by category
* View product details
* Search for products by keyword
* Add/Remove/Edit items in a shopping cart
* Add/Remove items in a wish list
* Transfer a wish list to a shopping cart
* Register as a user
* Edit user profile
* Login and logout
* Checkout and purchase product

You can find more information on the Pet Shop 4 download page.

Requirements

A brief Security Objectives exercise on Pet Shop 4 revealed the following key security requirements:
* Anonymous users should be able to browse but not purchase products
* Anonymous or authenticated users should not be able to view or modify the contents of another user’s shopping cart
* Anonymous or authenticated users should not be able to view or modify the contents of another user’s wish list
* Product pricing and other product details should not be modified by any user
* Authenticated user credentials should be protected from disclosure and tampering
* Shopping cart contents should be protected from disclosure and tampering
* Wish list contents should be protected from disclosure and tampering
* Credit card information should be protected from disclosure and tampering
* Customer profile data should be protected from disclosure and tampering
* Sensitive data stored in the database, including item inventory and orders, should be protected from disclosure and tampering
* Sensitive data in transit, including credit card information, inventory, and orders, should be protected from disclosure and tampering

All of the security requirements are met by the Pet Shop 4 security reference implementation that is a companion to this document.

Solution

The Web application uses forms authentication configured with the SQLMembershipProvider. The SQLRoleProvider is used for user authorization. Role and membership information is stored in a SQL database.

Problem and Solution Table

(Key Issues and Decisions)
How to authenticate callers?
* Use Forms authentication
* Use SQL membership provider instead of custom authentication
* Use SSL to protect credentials and authentication cookies
* Validate user login information
* Do not store passwords directly in the user store
* Enforce strong passwords
* Protect access to the credential store
* Do not persist authentication cookies
* Set httpOnly on authentication cookies
* Use unique cookie names and paths
How to authorize?
* Use URL authorization for page and directory access control
* Use ASP.NET role manager for roles authorization
* Disable role caching
* Never use GET requests for write operations
* Set Page.ViewStateUserKey to a unique id for each user.
How to validate input?
* Validate input for length, range, format, and type
* Properly encode all data from un-trusted sources
How to perform data access?
* Encrypt connection strings in web.config
* Use least-privileged accounts for database access
* Use Windows authentication to connect to SQL
* Use a trusted service account instead of constrained delegation
* Use type safe SQL parameters and do not use dynamic queries
How to handle exceptions?
* Use structured exception handling
* Do not reveal exception details to the client
* Use a global error handler to catch unhandled exceptions
How to handle sensitive data?
* Encrypt credit card data in the database
* Protect credit card data over the wire using SSL
* Do not store credit card data in ViewState
* Do not cache credit card data
How to handle auditing and logging?
* Use health monitoring to log registration of new users, failed login attempts, invalid credit card numbers and order placement
* Log using PetShop4 as the event source

Solution Rationalization Table

The following table summarizes each security decision that was made and why it was made.
(Decision and Why)
Authentication
* Use Forms authentication
* This scenario does not include an Active Directory implementation and application users are not in a domain, which rules out Windows authentication.
* Forms authentication integrates easily with login controls and membership providers.
* Use the SQL Membership Provider instead of custom authentication
* Protects user credentials.
* Enforces strong passwords.
* Provides consistent APIs for authentication tasks.
* Use SSL to protect credentials and authentication cookies
* Protects user credentials from theft.
* Protects authentication cookie from being reused by an attacker.
* Validate user login information
* Protects against code injection attacks such as cross site scripting and SQL injection.
* Do not store passwords directly in the user store
* Protects password data if the user store is compromised.
* Enforce strong passwords
* Makes brute force and dictionary attacks prohibitively expensive.
* Protect access to the credential store
* Prevents unauthorized access or manipulation of user data.
* Do not persist authentication cookies
* Protects against identity spoofing attacks.
* Set HttpOnly on authentication cookies
* Protects the authentication cookie from cross-site scripting attacks.
* Use unique cookie names and paths
* Prevents unauthorized access to cookie values.

Authorization
* Use URL authorization for page and directory access control
* Access control is implemented with minimal effort.
* Use SQL role manager for roles authorization
* Privileges can be granted to roles instead of individual users, improving flexibility.
* No need to write custom code for roles management.
* Disable role caching
* Increased security since the role cookie does not exist to be spoofed.
* Never use GET requests for write operations
* Prevents zero-click elevation of privilege attacks.
* Set Page.ViewStateUserKey to a unique id for each user.
* Prevents one-click elevation of privilege attacks.
Input and Data Validation
* Do not rely solely on ASP.NET request validation
* Protects against code injection attacks, such as SQL Injection and Cross Site Scripting.
* Helps to protect against denial of service attacks.
* Do not rely on client-side validation
* Unlike server-side validation, it is easily bypassed.
* Validate input from all sources like QueryString, cookies, and HTML controls
* Protects against code injection attacks, such as SQL Injection and Cross Site Scripting.
* Properly encode all data from un-trusted sources
* Protects against Cross Site Scripting attacks.

Data Access
* Encrypt connection strings in web.config
* Protects server, database names and credentials from being stolen if the web.config file is compromised.
* Use Windows authentication to connect to SQL
* Credentials are at less risk of being stolen.
* Use least-privileged accounts for database access
* Reduces impact of SQL Injection attacks.
* Use a trusted service account instead of constrained delegation
* Improves scalability due to connection pooling.
* Use type safe SQL parameters and do not use dynamic queries
* Protects against SQL injection attacks.

Exception Management
* Use structured exception handling
* Protects against denial of service attacks by ensuring unmanaged resources are disposed of.
* Prevents information disclosure.
* Do not reveal exception details to the client
* Deprives attackers of implementation details that could be used to launch attacks.
* Use a global error handler to catch unhandled exceptions
* Deprives attackers of implementation details that could be used to improve their attacks.

Sensitive Data
* Encrypt credit card data in the database
* Protects credit card information if stolen from the database.
* Protect credit card data over the wire using SSL
* Protects credit card data from theft when being transmitted between the client and the server.
* Do not store credit card data in ViewState
* Protects credit card data from theft when transmitted between the client machine and the server.
* Do not cache credit card data
* Protects credit card data from being stolen from the client machine.

Auditing and Logging
* Use health monitoring to log important security events
* Detection of suspicious activity.
* Log using PetShop4 as the event source
* Separates PetShop4 logs from all other ASP.NET logs

Solution Implementation

The following sections walk through the key security decisions that were made during creation of the Pet Shop 4 security reference implementation. These decisions are organized by categories that represent the areas where mistakes are most often made. Each decision includes a description of how we implemented the change, why we made the decision, a list of benefits realized in the reference implementation as well as liabilities to be aware of.

Authentication

Authentication is required for positive identification of users. Authentication guards against unauthorized access and modification of user-specific data such as user profiles and orders. The reference implementation does not allow unauthenticated users to purchase products or access user-specific information.

Key engineering decisions:
* Use Forms authentication
* Use SQL Membership Provider instead of custom authentication
* Use SSL to protect credentials and authentication cookies
* Do not store passwords directly in the user store
* Enforce strong passwords
* Protect access to the credential store
* Do not persist authentication cookies
* Set HttpOnly on authentication cookies
* Use unique cookie names and paths

Use Forms Authentication

The reference implementation uses forms authentication.

How it was Implemented
The following steps explain how forms authentication was implemented for this reference implementation:

1. Configure Forms Authentication in web.config
To configure Forms Authentication, the following settings were added to web.config. The individual properties for this configuration are discussed in subsequent sections.


<authentication mode="Forms">
  <forms name="PetShopAuth"
         loginUrl="~/SignIn.aspx"
         protection="All"
         requireSSL="true"
         timeout="30" />
</authentication>

2. Create Authentication Pages
After configuring the application for forms authentication it is necessary to create the login pages. The following pages are implemented in this solution architecture to support forms authentication:

* SignIn.aspx – Uses the Login control to present the user with a simple username and password interface. Upon submitting the form, the user’s credentials are verified against the SQL membership provider. If the credentials are valid, the form returns an authentication cookie that is sent by the browser during subsequent requests to identify the user.

<asp:Login ID="Login" runat="server" CreateUserUrl="~/NewUser.aspx"…

* NewUser.aspx – Uses the CreateUserWizard control to walk a new user through the steps of creating a new account. The control passes the information provided by the user to the SQL membership provider, which stores the data in the database.

<asp:CreateUserWizard ID="CreateUserWizard" CreateUserButtonText="Sign Up"…

* MasterPage.master – Uses the LoginStatus control to allow the user to sign-in or sign-out of the website. The Sign In link takes the user to the SignIn.aspx page while the Sign Out link instructs the browser to clear the authentication cookie in the user’s browser cache.

<asp:LoginStatus ID="lgnStatus" runat="server" CssClass="link"
    LoginText="Sign In" LogoutAction="Refresh" LogoutText="Sign Out" />

Why
Forms authentication was used in this scenario because the application’s users will not be in an Active Directory or a Windows domain. However, forms authentication does not protect user credentials and it does not enforce strong passwords. Integrate forms authentication with ASP.NET Membership, Role Manager and SSL in order to protect against these risks.
Benefits
* Reduces attack surface as the application accounts are not tied to Windows system accounts.
* Forms authentication integrates easily with login controls and membership providers.
Liabilities
Forms authentication in this scenario has some drawbacks. It does not:
* Validate username or password
* Enforce strong passwords
* Protect user credentials or the authentication cookie - they are transmitted in clear text
* Provide server side logic for validating user credentials.

These issues are addressed through use of the ASP.NET Membership Provider and the use of SSL to encrypt sensitive data before it is transmitted.

More Information
* For more information on using forms authentication see, “How To: Use Forms Authentication with SQL Server in ASP.NET 2.0”.


Use SQL Membership Provider Instead of Custom Authentication

The reference implementation is configured to use the SQL Membership Provider.


How it was Implemented
The following steps explain how membership was implemented for this reference implementation:

1. Create the Membership database
The SQL Role and membership providers require a specific database schema to operate. This database is generated automatically when you install this solution architecture. You can configure it manually using the ASPNet_regsql.exe command line utility. The following parameters create the tables necessary to support the Membership and Role providers in a database named “MSPetShop4Services” hosted by SQLExpress.

aspnet_regsql -S .\sqlexpress -A mr -d MSPetShop4Services

2. Configure the SQL Membership Provider in the web.config
The SQL Membership provider is configured in the web.config. The connection string tells ASP.NET how to connect to the SQL database that stores account information.

<add name="SQLMembershipConnString" connectionString="server=.\sqlexpress;database=MSPetShop4Services;integrated security=SSPI;min pool size=4;max pool size=4;" providerName="System.Data.SqlClient"/>

3. Specify user account options
The membership tag instructs ASP.NET to use the SQL Membership provider. This tag references the connection string to use and specifies options that control how user accounts are treated.

<membership defaultProvider="SQLMembershipProvider">
  <providers>
    <add name="SQLMembershipProvider"
         type="System.Web.Security.SqlMembershipProvider"
         connectionStringName="SQLMembershipConnString"
         applicationName=".NET Pet Shop 4.0"
         enablePasswordRetrieval="false"
         enablePasswordReset="true"
         requiresQuestionAndAnswer="false"
         requiresUniqueEmail="false"
         passwordFormat="Hashed"/>
  </providers>
</membership>

Why
The ASP.NET 2.0 membership feature helps protect credentials, can enforce strong passwords, and integrates with the ASP.NET login controls. By using the membership feature, the reference implementation gains a secure solution for user authentication and management with very little development effort.
Benefits
* Consistent APIs for authentication tasks
* Provider model makes it easy to change the provider used
* Easy integration with login controls
* Protects user credentials
* Enforces strong passwords
* Password strength rules can be configured
* Password format for storing the password in the database can be configured.

Liabilities
* The authentication cookies are not protected by default over the wire, opening the application up to cookie replay attacks. This issue is addressed by using SSL

More Information
* For more information on using the ASP.NET Membership features see, “How To: Use Membership in ASP.NET 2.0”.


Use SSL to Protect Credentials and Authentication Cookies

The reference implementation uses SSL to protect sensitive data on the network

How it was Implemented
For instructions on setting up IIS to use SSL when using this reference implementation, see, “SSL Instructions”.

The following steps explain how SSL was enabled for this reference implementation:

1. Set requireSSL to true
The requireSSL attribute is set on the forms element in web.config. This setting prevents the browser from sending the authentication cookie over an unsecured link.

<forms name="PetShopAuth"
       loginUrl="SignIn.aspx"
       protection="All"
       requireSSL="true"
       timeout="60"/>

2. Partition the site into restricted areas and public areas
All pages that require authenticated access are placed in either the Restricted or Admin folders. Pages and resources in other directories are anonymously accessible. By partitioning the site, SSL is used only on those pages that need it. Since the use of SSL has a significant overhead, using it sparingly will improve system performance.
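In addition to the requireSSL cookie setting, HTTPS can also be enforced programmatically for the restricted pages. The following is a minimal sketch and is not taken from the reference implementation; the RestrictedBasePage class name is hypothetical.

using System;
using System.Web.UI;

// Hypothetical base class for pages under the Restricted and Admin folders.
// The reference implementation relies on IIS settings and requireSSL; this
// sketch simply shows one way to redirect plain HTTP requests to HTTPS.
public class RestrictedBasePage : Page
{
    protected override void OnInit(EventArgs e)
    {
        base.OnInit(e);

        if (!Request.IsSecureConnection)
        {
            // Rebuild the URL with the https scheme so credentials and
            // cookies are never sent in the clear.
            string secureUrl = "https://" + Request.Url.Host + Request.RawUrl;
            Response.Redirect(secureUrl, true);
        }
    }
}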

Why
SSL is used to protect both user credentials and authentication cookies from theft as they are transmitted. A common threat to web applications is identity spoofing, allowing an attacker to assume the identity of a valid user. In PetShop 4, identity spoofing could be used to steal personal information, such as credit card info, address and telephone numbers, from another user.

Forms authentication with the SQL membership provider sends credentials and cookies to the server in plain text. This makes it possible to use network monitoring tools to steal the data as it travels over the network. Using SSL encrypts this data in transit, making it computationally infeasible for an attacker to decipher it.

The requireSSL option ensures that cookies are not sent when accessing non-HTTPS anonymous pages. This protects the cookie from inadvertently being sent in clear text.
Benefits
* Protects user credentials from theft
* Protects authentication cookie from being reused by an attacker
Liabilities
SSL has some drawbacks:
* There is a performance impact when encrypting and decrypting sensitive data
More Information
* For more information on using SSL see, “How To: Use SSL on a Web Server”.

Do Not Store Passwords Directly in the User Store

The reference implementation uses SHA1 hashing to protect passwords.

How it was Implemented
A cryptographically strong hashing function, such as SHA1, is used to produce the one-way hash from the password. It is computationally infeasible to convert this hash value back into the original password. Furthermore, it is also computationally intensive to find a different password that will result in the same hash code.

Membership Provider uses SHA1 by default to hash passwords for authentication. By using membership provider and configuring the password format, the reference implementation is already secure from this perspective.
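For illustration only, the following sketch shows the general shape of one-way hashing with SHA1. The SqlMembershipProvider generates its own random salt per user and manages the stored format internally, so this is not the provider's actual code.

using System;
using System.Security.Cryptography;
using System.Text;

public static class PasswordHashDemo
{
    // Illustrative only: hash a salted password so that only the hash,
    // never the clear-text password, needs to be stored.
    public static string HashPassword(string password, string salt)
    {
        byte[] data = Encoding.UTF8.GetBytes(salt + password);
        using (SHA1 sha1 = SHA1.Create())
        {
            byte[] hash = sha1.ComputeHash(data);
            return Convert.ToBase64String(hash);
        }
    }
}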

Why
Passwords stored in plain text in the user store are vulnerable to being stolen. They can be stolen in the event that the data store is compromised, or by an insider with privileges to access the store or through SQL injection. A safer solution is to store a hash value generated from the original password.

Benefits
Password hashing provides the following benefits in this scenario:
* Makes it more difficult to steal password data if the user store is compromised

Liabilities
Password hashing has some drawbacks:
* It is not possible to retrieve the original password from the hashed value
* There is a marginal performance impact when hashing the password value

Enforce Strong Passwords

The reference implementation enforces strong passwords by default since it uses the SQL membership provider.

How it was Implemented
The reference implementation uses the SQL membership provider, which by default requires that passwords be at least seven characters long and contain at least one non-alphanumeric character.
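These limits can also be read back through the static Membership API, for example to display the password policy on the registration page. The helper below is a minimal sketch, assuming the default provider settings described above; it is not part of the reference implementation.

using System.Web.Security;

public static class PasswordPolicyHelper
{
    // Read the password policy enforced by the configured membership
    // provider so it can be shown to users on the registration page.
    public static string Describe()
    {
        int minLength = Membership.MinRequiredPasswordLength;             // 7 by default
        int minSpecial = Membership.MinRequiredNonAlphanumericCharacters; // 1 by default
        return string.Format(
            "Passwords must be at least {0} characters long and contain " +
            "at least {1} non-alphanumeric character(s).",
            minLength, minSpecial);
    }
}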

Why
Users want their passwords to be easy to remember. Consequently, many users will default to weak passwords unless strong password requirements are enforced.

Weak passwords make brute force password guessing attacks easier to conduct. Short passwords and passwords containing only letters and digits reduce the search space enough that an attacker can feasibly try all combinations in a relatively short period of time, typically starting with valid English words.

Benefits
* Makes brute force and dictionary attacks prohibitively expensive

Liabilities
Enforcing strong passwords has some drawbacks:
* It is harder for users to remember their passwords

Protect Access to the Credential Store

The reference implementation limits access rights of database users.

How it was Implemented
The MSPetShop4 database created with the aspnet_regsql command is automatically configured with roles that are granted only limited access. By default, only database administrators have complete access to the database.

Following the principle of least privilege, the reference application connects to the database using an account in the aspnetMembershipFullAccess role. Full access is required in order to allow new users to register themselves. This role represents the minimal set of privileges necessary in order to fulfill the needs of the application.

Why
Following the principle of least privilege, preventing unauthorized access to the credential store reduces the risk of credential data being stolen or tampered with.

Benefits
* Prevents unauthorized access or manipulation of the user data

Liabilities
* None

More Information
* For more information on the defined database roles see, Roles and Views in the Application Services Database for SQL Server.


Do Not Persist Authentication Cookies

The reference implementation does not use persistent authentication cookies.

How it was Implemented
In the reference implementation, the login control has RememberMeSet="false" to prevent authentication cookie persistence.

<asp:Login ID="Login" runat="server"
    RememberMeSet="false" ...

Since the reference implementation uses a layout template that doesn’t include the RememberMe checkbox, nothing else is needed. If a layout template were not used, DisplayRememberMe would also have been set to false to ensure the user could not override this setting.

Why
A stolen authentication cookie can be used by an attacker to access an application with another user’s identity. When the server instructs the browser to store a cookie, it can specify whether the cookie is persistent or not. A persistent cookie is stored on disk in the browser cache while a non-persistent cookie is only present in memory. Non-persistent authentication cookies cannot be stolen from the user’s browser cache.

Benefits
* Protects against identity spoofing attacks

Liabilities
Not using persistent authentication cookies has some drawbacks:
* Users must re-authenticate with every session

Set HttpOnly on Authentication Cookies

The reference implementation sets the httpOnlyCookies configuration attribute to true.

How it was Implemented
The httpOnlyCookies attribute is set to true in web.config and applies to all cookies in the application.

<httpCookies httpOnlyCookies="true" requireSSL="true" />
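The web.config setting covers the cookies ASP.NET issues, including the authentication cookie. For cookies created in application code, the flag can also be set explicitly; the helper below is a minimal sketch and is not part of the reference implementation.

using System.Web;

public static class CookieHelper
{
    // Sketch only: create a cookie that client-side script cannot read
    // and that the browser will only send over HTTPS.
    public static void AddProtectedCookie(HttpResponse response, string name, string value)
    {
        HttpCookie cookie = new HttpCookie(name, value);
        cookie.HttpOnly = true;  // hide the cookie from client-side script
        cookie.Secure = true;    // matches requireSSL: send only over HTTPS
        response.Cookies.Add(cookie);
    }
}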

Why
One way to steal authentication cookies is with a cross-site scripting attack. To protect against this attack, you can set the httpOnly option for all cookies. This option disables script access to cookies on browsers that support it. Since the reference implementation does not require script access to any cookie, setting this property to true provides protection with very little cost.

Benefits
* Protects the authentication cookie from cross-site scripting attack

Liabilities
* None

Use Unique Cookie Names and Paths

The reference implementation uses a unique name and path for the authentication cookie.

How it was Implemented
The reference implementation specifies unique values for the authentication cookie name and path in web.config, instead of taking the defaults (see the path attribute below).

<authentication mode="Forms">
  <forms name="PetShopAuth"
         path="/PetShop"
         loginUrl="~/Restricted/SignIn.aspx"
         protection="All"
         requireSSL="true"
         timeout="30" />
</authentication>

Why
Ensuring unique names and paths prevents problems that can occur when hosting multiple applications on the same server. For example, if you do not use distinct names, it is possible for a user who is authenticated in one application to make a request to another application without being redirected to that application's logon page.

Benefits
* Prevents unauthorized access to another application in a multiple-application hosting scenario.

Liabilities
Unique cookie names and paths have some drawbacks in this scenario:
* Paths on cookies can lead to problems due to case sensitivity. Be sure that the URLs in any hyperlinks you create match the case of the path property value

Authorization

Authorization is required to give authenticated users permission to carry out specific operations. Authorization guards against unauthorized access and modification of data such as other users’ information, pricing, and product details. The reference implementation does not allow users to modify other users’ data, pricing, or product details.

Key engineering decisions:
* Use URL authorization for page and directory access control
* Use ASP.NET role manager for roles authorization
* Disable role caching
* Never use GET requests for write operations
* Set Page.ViewStateUserKey to a unique id for each user.

Use URL Authorization for Page and Directory Access Control

The ASP.NET URL Authorization Module and the SQL Role Provider are used to control user access in the reference implementation.

How it was Implemented
Configure URL Authorization in web.config. The URL authorization module makes it easy to implement authorization, requiring only two sections in the web.config file.

		      <location path="Restricted">
		        <system.web>
		          <authorization>
		            <deny users="?"/> <!--deny anonymous users-->
		          </authorization>
		        </system.web>
		      </location>
	

		      <location path="Admin">
		        <system.web>
		          <authorization>
		            <allow roles="Administrators"/> <!--allow admins-->
		            <deny users="*"/> <!--deny everyone else-->
		          </authorization>
		        </system.web>
	

Why
Configuring access control allows you to specify which users are allowed to access specific pages and resources. Carefully select the pages that go in each path; accidentally placing an administrative page in the wrong path could open up avenues of attack.

Benefits
* Access control is implemented with minimal effort

Liabilities
URL authorization has some drawbacks:
* File location implies privilege; be careful where you put each page

Use SQL Role Manager for Roles Authorization
The reference implementation uses the SQL Role Manager to store roles in the SQL database and handle user authorization.

How it was Implemented
The following steps explain how roles authorization was implemented for this reference implementation:

1. Configure SQL Role Provider
The SQL Role provider is used to map users onto roles. All of the role data is stored in the same database as the application membership data. The role-related schema is created when the -A mr switch is used to create the membership database.
The role provider is configured with the following web.config setting.
<roleManager enabled="true" defaultProvider="SqlRoleManager">
  <providers>
    <add name="SqlRoleManager"
         type="System.Web.Security.SqlRoleProvider"
         connectionStringName="SQLMembershipConnString"
         applicationName=".NET Pet Shop 4.0" />
  </providers>
</roleManager>

2. Create Administrator Role Users
Installation of this reference implementation automatically creates an Administrator role in the ASP.NET role database. The easiest way to add users to this role is to use the ASP.NET web site configuration tool.
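The same task can also be scripted with the Roles API. The sketch below is illustrative only; it assumes the role name used in the URL authorization settings ("Administrators"), and the user name is a placeholder parameter.

using System.Web.Security;

public static class AdminRoleSetup
{
    // Sketch: add an existing membership account to the administrators
    // role programmatically instead of using the Web Site Administration Tool.
    public static void MakeAdministrator(string userName)
    {
        const string roleName = "Administrators"; // must match the web.config authorization rules

        if (!Roles.RoleExists(roleName))
        {
            Roles.CreateRole(roleName);
        }
        if (!Roles.IsUserInRole(userName, roleName))
        {
            Roles.AddUserToRole(userName, roleName);
        }
    }
}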

Why
The SQL Role Provider makes it possible to store user roles in SQL Server, a good solution for Internet-facing forms authentication where end users do not map to Windows accounts, so Windows groups cannot be used.

Benefits
* Privileges can be granted to roles instead of individual users
* No need to write custom code for roles management
* Roles are stored in the SQL database and no mapping to Windows accounts or groups is necessary

Liabilities
* None

More Information
* For more information on Role Manager, see “How To: Use Role Manager in ASP.NET 2.0”.


Disable Role Caching

The reference implementation disables role caching to protect against elevation of privilege attacks.

How it was Implemented
Set cacheRolesInCookie to false. The role cache is disabled in web.config with the following setting.

<roleManager enabled="true" defaultProvider="SqlRoleManager"
             cacheRolesInCookie="false">

Why
Role caching is disabled to protect against elevation of privilege attacks. There is little need to cache data in a cookie in the reference implementation because it defines only a single role and role lookups are relatively infrequent.

ASP.NET role management can be configured to cache role membership data in a role cookie to improve performance. When this option is enabled, the value of the role cookie sent by the browser is used to determine role membership. Consequently, if a role cookie is ever stolen, it would allow the attacker to elevate their privileges.

Benefits
Disabling role caching provides the following benefits in this scenario:
* Increased security, as there is no role cookie to be stolen or spoofed

Liabilities
Disabling role caching has some drawbacks:
* Role caching optimizes performance on role lookups. Disabling role caching removes this optimization.


Never Use GET Requests for Write Operations

The reference implementation was carefully developed and written so that write operations will never occur as a result of an HTTP GET. This protects from zero-click elevation of privilege attacks which would have allowed an attacker to assume a valid user’s identity for specific operations.

How it was Implemented
In the reference implementation all write operations are performed using POST requests only. This is the default behavior for ASP.NET. In order to break this rule you must do something custom. For instance, the original version of PetShop v4 had the following code:

<a href="ShoppingCart.aspx?AddItem=<%=itemId%>"/>

Then, in the load event for ShoppingCart.aspx, it had:

string itemId = Request.QueryString["AddItem"];
if (itemId != null)
{
    //Add Item
}

This code would have allowed a zero-click attack.
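For contrast, the following sketch shows the POST-based pattern: the add-to-cart action is wired to a server control event so it runs only on a postback of the page's own form, never on a bare GET of a URL. The control and method names here are illustrative, not the reference implementation's.

using System;
using System.Web.UI.WebControls;

public partial class ProductItem : System.Web.UI.UserControl
{
    // Markup (illustrative):
    // <asp:LinkButton ID="btnAddToCart" runat="server" Text="Add to Cart"
    //     CommandArgument='<%# itemId %>' OnCommand="btnAddToCart_Command" />

    protected void btnAddToCart_Command(object sender, CommandEventArgs e)
    {
        string itemId = (string)e.CommandArgument;

        // Validate itemId, then add it to the cart. AddItemToCart is a
        // placeholder for the application's shopping cart logic.
        AddItemToCart(itemId);
    }

    private void AddItemToCart(string itemId)
    {
        // placeholder for shopping cart logic
    }
}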

Why
GET requests are sent by the browser to retrieve resources such as pages, images, or other files. However, a GET request can have side effects. For instance a GET request to the hypothetical URL - /MakeUserAdmin.aspx?userId=5 would have the effect of adding the specified user to the administrator role.

Browsers will automatically generate GET requests when processing resource URLs embedded in an HTML page. If an attacker lures an authenticated user into viewing a page (or HTML mail) with a malicious embedded resource URL, the write operation will occur, resulting in an elevation of privilege attack.

Benefits
Never using GET requests for write operations provides the following benefits in this scenario:
* Prevents zero-click elevation of privilege attacks

Liabilities
* None

Set Page.ViewStateUserKey to a Unique ID for Each User
The reference implementation sets the Page.ViewStateUserKey to a unique ID in order to protect against one-click elevation of privilege attacks. One-click attacks can be used as an elevation of privilege attack allowing a malicious user to assume a valid user’s identity for specific operations that are completed in response to a POST request.

How it was Implemented
Each page in the reference implementation that performs write operations sets the ViewStateUserKey setting in the Page_Init handler. Search for “ViewStateUserKey” to see all instances.

if (Request.IsAuthenticated)
{
    ViewStateUserKey = Page.User.Identity.Name;
}

Why
ASP.NET has built-in protection against one-click elevation of privilege attacks with the ViewStateUserKey. Setting this value to an ID unique to each user allows ASP.NET to detect when this attack is occurring.

To effectively counter one-click attacks, this value must be unique to the user making the request. If session state is enabled then using Session.SessionID will work. However, the reference implementation has session state disabled, so using the current user login name is a good choice.
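If session state had been enabled, the session ID could have been used instead; a minimal sketch of that hypothetical alternative:

protected void Page_Init(object sender, EventArgs e)
{
    // Hypothetical alternative: valid only if session state is enabled,
    // which it is not in the reference implementation.
    ViewStateUserKey = Session.SessionID;
}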

Benefits
Setting Page.ViewStateUserKey to a unique ID provides the following benefits in this scenario:
* Prevents one-click elevation of privilege attacks

Liabilities
* None

Input and Data Validation

Input is validated to ensure only well-formed input is accepted. Input validation guards against code injection attacks, business logic attacks and denial of service attacks. The reference implementation does not trust input from any source and does not rely on client side validation. Input validation is paired with output encoding to protect against code injection attacks within the application.

Key engineering decisions:
* Do not rely solely on ASP.NET request validation
* Do not rely solely on client-side validation
* Validate input from all sources like QueryString, cookies, hidden fields, HTTP headers and HTML controls
* Properly encode all data from un-trusted sources

Do Not Rely Solely on ASP.NET Request Validation

The reference implementation leaves ASP.NET request validation enabled, but does not rely on it for protection.

How it was Implemented
Request validation is enabled by ASP.NET by default. You can see the following default setting in the Machine.config.comments file.
<pages validateRequest="true" ... />
Each input field contains custom validation, for example:

<asp:RequiredFieldValidator
    ID="valCCNumber" runat="server" ControlToValidate="txtCCNumber"
    Display="Dynamic" ErrorMessage="Please enter card number.">
</asp:RequiredFieldValidator>

<asp:RegularExpressionValidator ID="valCCNumber1"
    runat="server" ControlToValidate="txtCCNumber" Display="Dynamic"
    ErrorMessage="Card number invalid."
    ValidationExpression="^([0-9]{15,16})$">
</asp:RegularExpressionValidator>

Why
ASP.NET request validation blocks common attack strings found in HTTP requests. This helps to protect against some script injection attacks. However, it cannot prevent all attacks. The best defense is to author custom validation for each input field and encode all output.

Benefits
* Protects against code injection attacks, such as SQL Injection and Cross Site Scripting.
* Helps to protect against denial of service attacks
Liabilities
* None

Do Not Rely Solely on Client-side Validation

The reference implementation uses server-side validation to protect from attack and client-side validation to improve the user experience.

How it was Implemented
The reference implementation uses client-side validation in CreditCardForm.ascx. The page includes some JavaScript which is used to validate the expiration date of the card. However, the same validation also occurs on the server as can be seen in CheckOut.ascx.cs:

// <summary>
// Custom validator to check CC expiration date
// </summary>
protected void ServerValidate(object source, ServerValidateEventArgs value) {
    DateTime dt;
    if (DateTime.TryParse(value.Value, out dt))
        value.IsValid = (dt > DateTime.Now);
    else
        value.IsValid = false;
}
Why
Client-side validation can be easily circumvented by an attacker. Validation that occurs in client-side script is a useful tool for improving the user experience, but should not be relied upon for validating input.

Benefits
* Unlike client-side validation, it is not easily bypassed

Liabilities
* None

Validate Input From All Sources Like QueryString, Cookies, and HTML Controls

All sources of input in the reference implementation are validated for length, range, format and type. Sources of input include:
* Query String
* Cookies
* HTML controls
* Hidden fields
* HTTP headers

How it was Implemented
The reference implementation only uses white-list validation since it is easier to define the input that is trusted than to define all the possible combinations of bad input.

Validate with the RegularExpressionValidator control
In the reference implementation all input is validated in the presentation layer. Every user input web form control has a corresponding RegularExpressionValidator control that defines the set of good input. This control validates on both the client and server, so it is an ideal solution.

As an example, the following code can be found in the AddressForm.ascx user control. The regular expression control is used to ensure that the txtFirstName value contains only letters, numbers and spaces. It also validates that the length of the string is between 1 and 80 characters long.

<asp:TextBox ID="txtFirstName" runat="server" Columns="30" CssClass="checkoutTextbox"
    MaxLength="80" Width="155px"></asp:TextBox><br />


<asp:RegularExpressionValidator ID="val2FirstName" runat="server ControlToValidate="txtFirstName" CssClass="asterisk" ErrorMessage="Please use only letters, digits, or space." ValidationExpression="^a-zA-Z0-9 {1,80}$" />

There are a few places where data is obtained from the query string or other sources that are not tied directly to a web form control. In these cases, validation is done manually using regular expressions. As an example, the following code in ShoppingCartControl.aspx.cs is used to validate the value of the itemID, which is obtained from the form data of the HTTP request.

if (!WebUtility.validItemId.IsMatch(itemId))
{
    Response.Redirect("~/Error.aspx");
}

The WebUtility class defines a static validItemId regular expression. This is an elegant solution because the expression is only created once and can be reused anywhere the code needs to validate an item id. This expression constrains the input to between 1 and 10 letters, numbers, or dash characters.

public static readonly Regex validItemId =
    new Regex("^[a-zA-Z0-9-]{1,10}$", RegexOptions.Compiled);

The use of regular expressions makes it possible to quickly and easily define precisely the expected values for a given input field. Regular expressions are extremely powerful, but also can produce unexpected results. It is important to test all expressions thoroughly for correctness.

More examples in the reference implementation
All instances of input validation have been marked in the reference implementation with special comments. You can quickly spot them all by searching the entire solution (ctrl-shift-f) for “Category: InputValidation”.

Why
Sometimes certain sources of input are assumed to be safe. However, any value that is sent as part of an HTTP request should be considered untrusted data. This includes values from the QueryString, cookies, and HTML controls. Some server variables, such as SERVER_NAME, are also under user control and should not be trusted.

Benefits
* Protects against code injection attacks, such as SQL Injection and Cross Site Scripting.
* Helps to protect against denial of service attacks

Liabilities
* Use of Regex has significant performance implications

More Information
* For more information on regular expressions, see .NET Framework Regular Expressions.

Properly Encode All Data From Un-trusted Sources

The reference implementation encodes all untrusted data used in a response.

How it was Implemented
The reference implementation protects against cross-site scripting using the custom WebUtility.HtmlEncode and WebUtility.UrlEncode functions. These functions were created for this application and are used to appropriately encode all data that could contain un-trusted input. ASP.NET’s Server.HtmlEncode is not used as it does not provide adequate protection.

CartList.ascx
The business data returned from the database should not be trusted and is wrapped by a call to HtmlEncode.

Example:

<%# WebUtility.HtmlEncode(DataBinder.GetPropertyValue(Container.DataItem,
"Name")) %>

ItemsControl.ascx
In this instance, the query string data is being written into a URL context. So the UrlEncode method is used.

<a href='Products.aspx?categoryId=<%=WebUtility.UrlEncode(Request.QueryString["categoryId"])%>&productId=<%=WebUtility.UrlEncode(Request.QueryString["productId"]) %>'>

More examples in the reference implementation
In the reference implementation, all instances of output have been encoded. These have been marked with a special comment tag to ease finding them. You can find them all in Visual Studio by searching the entire solution (ctrl-shift-F) for “Category: OutputEncoding”. Some examples are included below for reference.

Why
An application is vulnerable to cross-site scripting (XSS) attacks whenever unconstrained user input is written directly to the response stream with insufficient encoding. This allows an attacker to inject client-side script code into the page, thereby giving access to privileged information, such as authentication cookies or web page content.

The best way to protect against cross-site scripting attacks is to encode all output appropriately based on context. When the output is written into the body of an HTML page, it should be HTML encoded. If the output is written into a URL, it should be URL encoded. If the output is written into a JavaScript string, it should be JavaScript encoded. By encoding the output, the browser will always interpret the text as plain text instead of as HTML.

It is important to note that even data from the products database is not fully trusted. In the reference implementation, this data is populated by setup scripts. However, in a real-world application, this data could originate from many sources. Untrusted data in the database could be used to conduct a cross site scripting attack that replays every time the bad data is retrieved. Since the presentation layer cannot be sure that the data is fully trusted, it encodes all database strings before writing them to the output stream.

ASP.NET’s Server.HtmlEncode function is often used to protect against cross-site scripting attacks. However, this function only encodes the <, >, ", and & characters. This is not sufficient to protect against all possible attacks. For instance, the following ASP.NET code would be vulnerable.

<img id='img<%=Server.HtmlEncode(Request.QueryString["userId"]) %>' src='/image.gif' />

An attacker could inject client-side script here by setting userId to
' onload=alert('xss') alt='

The safest solution is to encode all non-alphanumeric characters. Only this type of whitelist solution will catch all possible XSS attacks, regardless of context. This is the approach that the custom WebUtility.HtmlEncode method written for this application takes. It requires more overhead in terms of processing time and size of the resulting HTML, but it is the safest encoding mechanism for all HTML contexts:

public static string HtmlEncode(string x)
{
    if (x == null)
    {
        return x;
    }
    return Regex.Replace(x, "[^a-zA-Z0-9]+",
        new MatchEvaluator(WebUtility.EncodeMatch));
}

This is also the approach that the Microsoft Anti-Cross Site Scripting Library takes.

Benefits
* Protects against Cross Site Scripting attacks.

Liabilities
Encoding output has some drawbacks:
* There is a small performance impact associated with the encoding

More Information
* For information on the Microsoft Anti-Cross Site Scripting Library, see “Microsoft Anti-Cross Site Scripting Library V1.0 Download”.


Data Access

The reference implementation protects data access information, such as connection strings and user credentials, so that attackers cannot use it to launch attacks against the application. It also uses a least-privileged account to access the database, which minimizes the impact if the application is compromised, and it guards against SQL injection attacks.

Key engineering decisions:
* Encrypt connection strings in web.config
* Use Windows authentication to connect to SQL
* Use least-privileged accounts for database access
* Use a trusted service account instead of constrained delegation
* Use type safe SQL parameters and do not use dynamic queries

Encrypt Connection Strings in Web.config

The reference implementation encrypts connection strings in the web.config


How it was Implemented

Connection strings can be encrypted using the aspnet_regiis.exe utility. It is possible to use either RSA or DPAPI encryption. In a Web farm, use the RSA-protected configuration provider because RSA keys can be exported and imported across servers.

The reference implementation encrypts using DPAPI:
aspnet_regiis.exe -pef "connectionStrings" "C:\Program Files\Microsoft\.NET Pet Shop 4.0\PetShop" -prov "DataProtectionConfigurationProvider"
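The same protection can also be applied from code through the configuration API. The sketch below is illustrative only and would typically be run once by an administrator with write access to web.config.

using System.Configuration;
using System.Web.Configuration;

public static class ConfigEncryption
{
    // Sketch: protect the <connectionStrings> section with DPAPI,
    // equivalent in effect to the aspnet_regiis command shown above.
    public static void ProtectConnectionStrings(string appVirtualPath)
    {
        Configuration config = WebConfigurationManager.OpenWebConfiguration(appVirtualPath);
        ConfigurationSection section = config.GetSection("connectionStrings");

        if (section != null && !section.SectionInformation.IsProtected)
        {
            section.SectionInformation.ProtectSection("DataProtectionConfigurationProvider");
            config.Save();
        }
    }
}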

Why

Since the reference implementation uses Windows authentication to connect to the database, database credentials are not included in the connection string. The connection string is still encrypted, however, in order to protect the server and database names.

Benefits

* Protects the server and database names from being stolen if the web.config file is compromised

Liabilities

* None

Use Windows Authentication to Connect to SQL

The reference implementation uses Windows authentication to connect to the SQL database.

How it was Implemented

The connection strings in the reference implementation specify Windows authentication with the integrated security=SSPI; setting.

connectionString="server=.\sqlexpress;database=MSPetShop4Services;integrated security=SSPI;

Why

Windows authentication is a good solution for connecting to SQL because credentials are not specified in the connection string and are not passed over the network.

Benefits
* Credentials are at less risk of being stolen
* Allows you to leverage connection pooling
Liabilities
* None

Use Least-privileged Accounts for Database Access

The reference implementation uses accounts with the least possible privileges when connecting to the database.

How it was Implemented
The reference implementation installer automatically creates a database role named PS4User. This role is granted privileges to execute only a specific set of stored procedures. The default ASPNET process identity is added as a member of this role. If you want to change the process identity, you will have to add the new account to this role manually.

The implementation of this configuration can be found in the <dbname>perms.sql files in the DatabaseScripts\SQL folder. The contents of the MSPetShop4Perms.sql file is included below.

USE MSPetShop4
declare @hostName as varchar(25)
set @hostname = HOST_NAME() + '\ASPNET'
exec sp_addrole 'PS4User'
exec sp_grantlogin @hostname
exec sp_grantdbaccess @hostname
exec sp_addrolemember 'PS4User', @hostname
exec sp_addrolemember 'aspnetChangeNotification_ReceiveNotificationsOnlyAccess', @hostname
GRANT EXECUTE ON CreateSupplier TO PS4User
GRANT EXECUTE ON ReadCategories TO PS4User
GRANT EXECUTE ON ReadCategory TO PS4User
GRANT EXECUTE ON ReadInventory TO PS4User
GRANT EXECUTE ON ReadItem TO PS4User
GRANT EXECUTE ON ReadItemsByProduct TO PS4User
GRANT EXECUTE ON ReadProduct TO PS4User
GRANT EXECUTE ON ReadSupplier TO PS4User
GRANT EXECUTE ON SelectProductsByCategory TO PS4User
GRANT EXECUTE ON SelectProductsBySearch TO PS4User
GRANT EXECUTE ON UpdateInventoryQty TO PS4User
GRANT EXECUTE ON UpdateSupplier TO PS4User

Why
It is important to connect to the database in the context of an account with the fewest privileges necessary to meet the needs of the application. For instance, the Pet Shop 4 application does not need to create or delete tables or run extended stored procedures.

The ideal solution is to run as a user with access only to a limited set of stored procedures that are created specifically to meet the needs of the application. Direct access to the tables themselves should be denied. With this solution, if the middle tier is ever compromised, the database damage is limited to only operations defined by the stored procedures.

Benefits
Using least-privilege accounts for database access provides the following benefits in this scenario:
* Reduces impact of SQL Injection attacks
* Reduces impact if the middle tier is compromised in any other way

Liabilities
* None

Use a Trusted Service Account Instead of Constrained Delegation

The reference implementation uses a trusted service account when connecting to the database.

How it was Implemented
The reference implementation does not use impersonation, so it is automatically connecting using a trusted service account. By default this is the ASP.NET process identity.

In the constrained delegation scenario the application would connect to the database using the identity of the end-user making the HTTP request rather than the ASP.NET process identity. This scenario only applies when using impersonation.

Why
Using a trusted service account is a scalable solution because it allows the SQL Client to make more efficient use of connection pooling. It also does not require granting special privileges to the ASP.NET process to enable constrained delegation.

Benefits
* Improves scalability

Liabilities
* Windows auditing cannot be used to track individual user access to back-end resources
* Fine-grained, per-user access controls on resources (such as databases) cannot be implemented

Use Type Safe SQL Parameters and Do Not Use Dynamic Queries

In the reference implementation, all database access is performed through stored procedures whenever possible, and type safe parameters are always used.

How it was Implemented
Dynamic queries composed of user input are avoided in the code as well as in the stored procedures in the database layer. For example:

//Create a parameter
SqlParameter parm = new SqlParameter(PARMCATEGORYID, SqlDbType.VarChar, 10);

//Bind the parameter
parm.Value = categoryId;

using (SqlDataReader rdr =
        SqlHelper.ExecuteReader(
            SqlHelper.ConnectionStringLocalTransaction,
            CommandType.StoredProcedure,
            SQL_SELECT_CATEGORY, parm))
{
    if (rdr.Read())
        category = new CategoryInfo(rdr.GetString(0), rdr.GetString(1),
                                    rdr.GetString(2));
    else
        category = new CategoryInfo();
}

Why
With the careful use of stored procedures, the reference implementation is well protected from SQL injection attacks.

Benefits
* Protects against SQL injection attacks
* Better performance with stored procedures

Liabilities
* None

Exception Management

Correct exception handling in your Web pages prevents sensitive exception details from being revealed to the user, improves application robustness, and helps avoid leaving your application in an inconsistent state in the event of errors. The reference implementation uses structured exception handling and does not reveal exception details to the client. A global error handler catches any unhandled exceptions.

Key engineering decisions:
* Use structured exception handling
* Do not reveal exception details to the client
* Use a global error handler to catch unhandled exceptions

Use Structured Exception Handling

The reference implementation uses structured exception handling throughout.

How it was Implemented
The reference implementation includes try/catch/finally constructs around all database calls. These blocks ensure that connections are properly closed even in the event of a database error.

It is important to note that the C# using construct implicitly includes a finally clause that calls the object’s dispose method. This makes the code very easy to read. An example of this construct can be found in the SQLHelper.ExecuteNonQuery method.

using (SqlConnection conn = new SqlConnection(connectionString)) {
    PrepareCommand(cmd, conn, null, cmdType, cmdText, commandParameters);
    int val = cmd.ExecuteNonQuery();
    cmd.Parameters.Clear();
    return val;
}

Another use of exception handling in the reference implementation is to implement transactions. In the case of a database error, it is important to roll back the transaction to maintain data integrity. An example of this can be found in the PetShopProfileProvider.SetAccountInfo method.

try {
    SqlHelper.ExecuteNonQuery(trans, CommandType.StoredProcedure,
                              sqlDelete, param);
    SqlHelper.ExecuteNonQuery(trans, CommandType.StoredProcedure,
                              sqlInsert, parms);
    trans.Commit();
}
catch (Exception e) {
    trans.Rollback();
    throw new ApplicationException(e.Message);
}
finally {
    conn.Close();
}
Why
The most important reason to handle exceptions is to properly dispose of unmanaged resources, such as database connections. Failure to dispose of unmanaged resources can make the application vulnerable to denial of service attacks. Another important reason for exception handling is to prevent information disclosure vulnerabilities. This is discussed in the following sections.

Benefits
* Protects against denial of service attacks by ensuring unmanaged resources are disposed of
* Prevents information disclosure

Liabilities
* Exception handling has a minor performance impact since it sets up a new stack frame

Do Not Reveal Exception Details to the Client

The reference implementation does not reveal exception details to the client.

How it was Implemented
ASP.NET allows you to define a custom error page to deal with unhandled exceptions. This is configured in the web.config for the reference application with the following tag:

<customErrors defaultRedirect="Error.aspx" mode="RemoteOnly"/>

With this tag, ASP.NET will automatically redirect the client to the Error.aspx page whenever an unhandled exception occurs while processing a request from a remote client. If the request originates from the server machine itself, the resulting error page includes details useful to the developer, including the call stack and exception details. This is a good setting to use during development. Before the application is deployed, this setting should be changed to mode="On".

The Error.aspx page provides no error details, only contact information.

Why
Exception objects contain detailed information about the application. This information can be used by an attacker to learn more about the application and aid in future attacks.

Benefits
* Deprives attackers of implementation details that could be used to improve attacks.

Liabilities
* None

Use a Global Error Handler to Catch Unhandled Exceptions

The reference implementation uses a global error handler to catch any exceptions that were not explicitly handled.

How it was Implemented
By default, ASP.NET 2.0 automatically logs all unhandled exception data to the application event log using the Health Monitoring feature.
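
The reference implementation relies on this default behavior rather than custom code. For illustration only, an explicit global handler could be added in Global.asax; the sketch below is an assumption, not part of the reference implementation.

// Global.asax - hypothetical explicit global error handler (not part of
// the reference implementation, which relies on Health Monitoring).
void Application_Error(object sender, EventArgs e)
{
    Exception ex = Server.GetLastError();
    if (ex != null)
    {
        // Keep exception details on the server; never send them to the client.
        System.Diagnostics.Trace.WriteLine(ex.ToString());
        Server.ClearError();
        Response.Redirect("~/Error.aspx");
    }
}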

Why
Unhandled exceptions will cause the application to crash, interrupting service to the user and potentially providing implementation details to an attacker.

Benefits
* Improves application stability.
* Deprives attackers of implementation details that could be used to improve attacks.

Liabilities
* None

Sensitive Data
Sensitive data needs to be protected in persistent storage, in memory, and while it is on the network. Where possible, look for opportunities to avoid storing sensitive data. The reference implementation encrypts sensitive data, such as credit card information, in the database and protects it over the wire using SSL.

Key engineering decisions:
* Encrypt credit card data in the database
* Protect credit card data over the wire using SSL
* Do not store credit card data in ViewState
* Do not cache credit card data

Encrypt Credit Card Data in the Database

The reference implementation encrypts credit card data before storing it in the database.

How it was Implemented
The reference implementation encrypts the credit card data using asymmetric encryption before storing it in the database. The following code shows implementation details:

public void Insert(OrderInfo order) {

    // BAT:
    //
    // Category: Cryptography
    //
    // Description: Ensure the credit card data is encrypted
    // before storing it in the database.
    order.CreditCard.Encrypt();

    // Insert the order (a)synchronously based on configuration
    orderInsertStrategy.Insert(order);
}

public void Encrypt()
{
    if (encryptor == null)
    {
        throw new ApplicationException(
            "Could not get public encryption key");
    }

    // First convert the card number to bytes
    byte[] cardData = Encoding.UTF8.GetBytes(cardNumber);

    // Use the encryptor object to perform encryption
    byte[] encryptedData = encryptor.Encrypt(cardData, true);

    // Convert back to string with Base64 encoding.
    // Base64 is used as a robust way to encode an arbitrary bit
    // sequence as a string. It has no value from a security
    // perspective.
    cardNumber = System.Convert.ToBase64String(encryptedData);
}

static CreditCardInfo()
{
    // Load the RSACryptoServiceProvider using the public key
    // specified in the application configuration file.
    string keyFile = ConfigurationManager.AppSettings["PublicKeyFile"];

    if (HttpContext.Current != null && !Path.IsPathRooted(keyFile))
    {
        keyFile = HttpContext.Current.Server.MapPath("~/" + keyFile);
    }

    if (keyFile != null && File.Exists(keyFile))
    {
        string publicKeyXml;
        using (StreamReader keyFileIn = File.OpenText(keyFile))
        {
            publicKeyXml = keyFileIn.ReadToEnd();
        }
        encryptor = new RSACryptoServiceProvider();
        encryptor.FromXmlString(publicKeyXml);
    }
}

The public key data is stored in the web application root in a file named PublicKey.xml.config. Write access to the public key file should be restricted and file auditing should be turned on for this file.

The private key file is stored in the private directory for reference purposes only. In a real-world situation, the private key would not be located on the web server. Ideally, it would be stored on a smart-card or some other external secure storage device on the backend server. The credit card data is only as secure as the private key.
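
To illustrate the intent, the sketch below shows how a back-end system holding the private key might decrypt a stored card number. The file name PrivateKey.xml and the DecryptCardNumber method are hypothetical; no decryption code ships with the reference implementation.

// Hypothetical back-end decryption sketch (requires System, System.IO,
// System.Security.Cryptography, and System.Text). Not part of the
// reference implementation.
static string DecryptCardNumber(string encryptedCardNumber, string privateKeyFile)
{
    using (RSACryptoServiceProvider rsa = new RSACryptoServiceProvider())
    {
        // Load the private key XML, the counterpart of PublicKey.xml.config.
        rsa.FromXmlString(File.ReadAllText(privateKeyFile));

        byte[] encryptedData = Convert.FromBase64String(encryptedCardNumber);

        // true = OAEP padding, matching encryptor.Encrypt(cardData, true).
        byte[] cardData = rsa.Decrypt(encryptedData, true);
        return Encoding.UTF8.GetString(cardData);
    }
}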

Why
Encrypting the credit card data protects it in case it is stolen from the database. Theft could occur via a SQL injection attack or if the database server itself is compromised. Asymmetric encryption is used in this scenario because the application never needs to decrypt the credit card data. This implementation has a couple of advantages:
* The credit card data is safe even if the web server or database server is compromised. Retrieving the data requires the private key, which is available only on the back-end order processing systems. (Note: There is no back-end order processing system included with the reference implementation.)

* If symmetric encryption were used, extra care would have to be taken to store the encryption key on the application server because it could also be used to decrypt the data. By using asymmetric encryption, the attack surface for stealing the credit card data is significantly reduced.

The public key data is stored in the web application root in a file named PublicKey.xml.config. The config extension was used because ASP.NET is configured by default to block downloads of config files. There is no obvious harm in allowing access to the public key data, but it is still a good idea to restrict access.
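
This protection comes from the machine-level ASP.NET configuration, which maps the .config extension to the forbidden handler; roughly the following entry (shown here for illustration only, the application does not need to add it):

<httpHandlers>
  <add path="*.config" verb="*" type="System.Web.HttpForbiddenHandler" validate="True" />
</httpHandlers>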

Even with this configuration, the application is vulnerable to insider attacks. If someone overwrites the public key file with their own key, any data subsequently encrypted can be easily decrypted by them. Protecting against this threat would require using a certificate instead of a public key file and taking programmatic steps to verify that the identity behind the certificate is well known. With the current implementation, write access to the public key file should be restricted and file auditing should be turned on for this file.

Benefits
Encrypting credit card data in the database provides the following benefits in this scenario:
* Protects credit card information if stolen from the database.

Liabilities
Encrypting credit card data has some liabilities:
* There is a performance impact associated with encrypting and decrypting the data.

Protect Credit Card Data Over the Wire Using SSL

The reference implementation uses SSL to protect credit card data when it is transmitted.

How it was Implemented
The CheckOut.aspx page includes the form that requests credit card data. This page is located in the “Restricted” folder, which is configured to require SSL communication. As an added precaution, the Page_Init event handler includes a check to ensure an SSL link is being used.

if (!Request.IsSecureConnection)
{
    Response.Redirect(WebUtility.RestrictedUrl(Request.Url));
}

Why
Using SSL to transmit credit card data is necessary to prevent attackers from viewing this data as it is sent between the client and server. The check in the Page_Init handler helps enforce this requirement as the application evolves.

Benefits
Using SSL to transmit credit card data has the following benefits in this scenario:
* Protects credit card data from theft when being transmitted to and from the client machine.

Liabilities
Using SSL has the following drawbacks:
* There is a performance impact associated with encrypting and decrypting the data.

Do Not Store Credit Card Data in ViewState

The reference implementation does not store credit card data in ViewState.

How it was Implemented
In CheckOut.aspx, the credit card number, date, and type text input controls are all marked with EnableViewState="false". This prevents the form values from being stored in ViewState. The credit card number TextBox control is included below as an example.

<asp:TextBox ID="txtCcnumber" runat="server" Width="145px" CssClass="checkoutTextbox" EnableViewState="false"></asp:TextBox>

Why
ViewState is not designed for storing sensitive data. If credit card numbers were placed in ViewState, they would be visible to attackers as the page travels between the client machine and the ASP.NET application on the server. If ViewState were encrypted to counter this, its size would grow significantly, resulting in a noticeable performance impact.

A consequence of this decision is that the credit card values will be reset if the user clicks the “Previous” button in the checkout wizard. This is not a typical user scenario, so the tradeoff is reasonable.

Benefits
Keeping credit card data out of ViewState has the following benefits in this scenario:
* Protects credit card data from theft when transmitted between the client machine and the server.

Liabilities
Keeping credit card data out of ViewState has some drawbacks:
* If the user clicks the “Previous” button in the checkout wizard, they will have to re-enter their credit card details in the form.

Do Not Cache Credit Card Data

The reference implementation does not cache credit card data.

How it was Implemented
The CheckOut.aspx page includes the following code to disable all caching.

<%@ OutputCache Location="None" VaryByParam="none" %>

Why
The default cache setting is private, which instructs the browser to save a copy of the page in its local disk cache. This would expose the credit card data to attacks on the end user’s machine. Setting Location to "None" disables output caching for the page, so no copy is stored in the browser cache, further reducing the attack surface.
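
The same effect can be achieved programmatically; the following is an equivalent sketch, not the approach taken by the reference implementation, which uses the OutputCache directive shown above.

// Programmatic alternative to the OutputCache directive (sketch only).
Response.Cache.SetCacheability(HttpCacheability.NoCache);
Response.Cache.SetNoStore();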

Benefits
Keeping credit card data out of the cache has the following benefits in this scenario:
* Protects credit card data from being stolen from the client machine. This is especially important in shared-computer scenarios, such as Internet kiosks.

Liabilities
* None

Auditing and Logging

You should audit and log activity across the tiers of your application. Logs allow you to detect suspicious-looking activity, which frequently provides an early indication of a full-blown attack, and they help address the repudiation threat, where users deny their actions. The reference implementation uses the health monitoring feature to log important security events.

Key engineering decisions:
* Use health monitoring to log important security events


Use Health Monitoring to Log Important Security Events

The reference implementation uses health monitoring to log security events.

How it was Implemented
The ASP.NET health monitoring feature is used to track each of the following actions in the reference implementation:
* Creation of new users. Helps to track issues where a single user creates too many accounts.
* Failed login attempts. Helps to track brute force attacks on the application.
* Admin operations. Helps to track inappropriate modification to the store data.
* Product purchases. Helps to track down improper purchases.

The reference implementation uses the ASP.NET SQL Web event database to store the event data.
Install the Web event database by running the following command from the Visual Studio 2005 command prompt:

aspnet_regsql.exe -E -S <ServerName> -A w

The relevant configuration settings are shown below.

<healthMonitoring heartbeatInterval="0" enabled="true">
  <providers>
    <add connectionStringName="SQLMembershipConnString"
         maxEventDetailsLength="1073741823"
         buffer="true"
         bufferMode="Notification"
         name="PS4SqlWebEventProvider"
         type="System.Web.Management.SqlWebEventProvider,System.Web,Version=2.0.0.0,Culture=neutral,PublicKeyToken=b03f5f7f11d50a3a" />
  </providers>
  …
</healthMonitoring>
Using SQL is a good choice because the event data is more easily accessed by administrators and the configuration easily supports a web farm scenario. Also, there is very little overhead associated with this choice because the SqlWebEventProvider class stores the event data in the same database as the membership and role data.

To further improve security, events should be logged to a separate, protected server. This protects the database server from denial of service attacks and protects the logs from unauthorized viewing or tampering.

Implementation for each action is discussed in the following steps.

1. Audit new user creation
The reference implementation includes a new event class named PetShop.WebEvents.AccountCreatedEvent. This type inherits from System.Web.Management.WebSuccessAuditEvent because it is used to audit successful creation of user accounts.

The purpose of this class is to allow administrators to log this specific event without also logging other generic WebSuccessAuditEvent instances. As such, the class inherits all of its functionality from the base class, including logging the request IP address, which serves the goal of tracking the scenario where a single user creates many accounts.
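
A minimal sketch of such an event class is shown below. The constructor signature and the Code field name follow how the event is raised later in NewUser.aspx, but the event code value is an assumption; AdminOperationEvent, described later, follows the same pattern.

// Sketch of a custom Web event derived from WebSuccessAuditEvent.
// The event code value (WebExtendedBase + 1) is an assumption; custom
// event codes must simply be greater than WebEventCodes.WebExtendedBase.
using System.Web.Management;

public class AccountCreatedEvent : WebSuccessAuditEvent
{
    // Event code passed when the event is raised from NewUser.aspx.
    public const int Code = WebEventCodes.WebExtendedBase + 1;

    public AccountCreatedEvent(string message, object eventSource, int eventCode)
        : base(message, eventSource, eventCode)
    {
    }
}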

The web.config file includes the settings that control where and how often these events are reported.

<healthMonitoring heartbeatInterval="0" enabled="true">
  <eventMappings>
    <add name="New User Event"
         type="PetShop.WebEvents.AccountCreatedEvent"/>
  </eventMappings>
  <rules>
    <add name="New User Default" eventName="New User Event"
         provider="PS4SqlWebEventProvider"
         profile="Default" minInstances="1" maxLimit="Infinite"
         minInterval="00:00:10" custom="" />
  </rules>
</healthMonitoring>

The NewUser.aspx page includes the code that raises this event. It registers a handler for the CreateUserWizard control’s CreatedUser event that creates an instance of the PetShop.WebEvents.AccountCreatedEvent class, passing the new user name as part of the event message.

AccountCreatedEvent acctEvent =
    new AccountCreatedEvent("New user account: " +
                            this.CreateUserWizard.UserName,
                            this.CreateUserWizard,
                            AccountCreatedEvent.Code);
acctEvent.Raise();

2. Audit failed login attempts
By default, ASP.NET audits failed login attempts to the application event log. The reference implementation modifies this configuration so that all WebFailureAuditEvent events (the “Failure Audits” event mapping) are logged to the ASP.NET event database instead.

<rules>
  <clear/>
  <add name="Failure Audits Default" eventName="Failure Audits"
       provider="PS4SqlWebEventProvider" profile="Default"
       minInstances="1"
       maxLimit="Infinite" minInterval="00:01:00" custom="" />
</rules>

3. Audit all admin operations
The reference implementation includes a new event class named PetShop.WebEvents.AdminOperationEvent. This type inherits from System.Web.Management.WebSuccessAuditEvent because it is used to audit successful administrative operations.

The only purpose of this class is to allow administrators to log this specific event without also logging other generic WebSuccessAuditEvent instances. As such, the class inherits all of its functionality from the base class, including logging the identity of the user performing the request, which improves the accountability of store data modifications.


The web.config file includes the settings that control where and how often these events are reported.

<healthMonitoring heartbeatInterval="0" enabled="true">
  <eventMappings>
    …
    <add name="Admin Op Event"
         type="PetShop.WebEvents.AdminOperationEvent"/>
  </eventMappings>
  <rules>
    …
    <add name="Admin Op Default" eventName="Admin Op Event"
         provider="PS4SqlWebEventProvider"
         profile="Default" minInstances="1" maxLimit="Infinite"
         minInterval="00:00:10" custom="" />
  </rules>
</healthMonitoring>

The Admin/Suppliers.aspx page includes the code that raises this event. It registers event handlers for the ObjectDataSource control’s Inserted, Updated, and Deleted events. These handlers each create an instance of the PetShop.WebEvents.AdminOperationEvent class, passing the operation type as part of the event message.

AdminOperationEvent adminOp =
    new AdminOperationEvent("Supplier Updated",
                            sender, AdminOperationEvent.UpdateCode);
adminOp.Raise();

Why
Logging important security events with Health Monitoring helps you detect and react to suspicious activity. Logging can notify you when an attack is occurring and gives you enough information to stop the attack, determine its impact, and mitigate the exposed vulnerability.

Benefits
Using Health Monitoring to log important security events provides the following benefits in this scenario:
* Detection of suspicious activity.
* Determination of the impact of an attack.
* Information to help mitigate vulnerabilities exposed by an attack.

Liabilities
* Log files must be protected to keep application or user details away from attackers.
* Excessive logging, or attacks through the logging mechanism, can result in denial of service or lost logs.

More Information
See the following resources for more information:
* For more information on Health Monitoring see, How To: Use Health Monitoring in ASP.NET 2.0.

Companion Guidance

The following companion guidance is listed in the sequence in which it is referenced by this document. This is useful if you want to print the documents and refer to them in order.
* How To: Use Forms Authentication with SQL Server in ASP.NET 2.0
* How To: Use Membership in ASP.NET 2.0
* How To: Use SSL to Secure Communication with SQL Server
* Roles and Views in the Application Services Database for SQL Server
* How To: Use Role Manager in ASP.NET 2.0
* .NET Framework Regular Expressions
* Microsoft Anti-Cross Site Scripting Library V1.0
* How To: Use Health Monitoring in ASP.NET 2.0

Additional Resources

* Scenario and Solution: Forms Auth to SQL, Roles in SQL

Contributors

This guidance was completed thanks to the help of the following contributors:

Microsoft
* Nobuyuki Akama for valuable insight into real world scenarios
* Prashant Bansode for a thorough review and scrubbing of the content
* Denny Dayton for encouraging us to expose our thinking
* Jaquelyn Hiltz for helping us improve usability and readability
* Shawn Veney for reviewing and giving accurate feedback

Others
* Andy Eunson for feedback on organization and usefulness in the real world
* Alex Mackman for screening technical accuracy
* Rudy Araujo for helping us get all the details right
* Alik Levin for feedback on where we can take this next