In this multi-part series, App Dev Manager Jon Guerin and Premier Consultant Daniel Taylor (@dbabulldog) lay out some best practices around setting up and using In-Memory OLTP.

Recently a co-worker and I were approached by a team requesting assistance with In-Memory OLTP for SQL Server 2016. As we spoke, it became clear the team was interested in best practices around setting up and using In-Memory OLTP. I started my journey of searching the archives of the internet, my notes, and training materials, and surprisingly no single source pulled this information together. As adoption of the In-Memory solution continues to increase, we thought it would be fitting to put together a best practices cheat sheet to help drive conversations with our clients. What we are presenting here is a living document, one we believe will grow over time with additions to the solution and input from the community. We will be breaking this post into a two-part series. The best practices / cheat sheet is broken up into the following areas:
- Part one of the series (focus of this post)
  - Understanding the requirements for In-Memory OLTP features
  - Usage Scenarios
  - Planning
  - Schema Durability
  - Natively Compiled Stored Procedures
- Part two of the series (focus of the next post)
  - Indexing
  - Memory Management
  - Performance Considerations
  - Backup and Recovery
  - Monitoring
Understanding the requirements for In-Memory OLTP features:
- Follow the hardware and software requirements in the installing SQL Server documentation.
- Understand the additional requirements for memory-optimized tables (a quick support check is sketched after this list):
  - Available on 64-bit versions of SQL Server:
    - SQL Server 2016 SP1 (or later), any edition
    - SQL Server 2016 pre-SP1 (Enterprise or Developer edition)
    - SQL Server 2014 (Enterprise or Developer edition)
  - Enough memory to hold:
    - data in memory-optimized tables and indexes
    - additional memory to support the online workload
  - Ensure enough memory resources are available when running SQL Server in a VM:
    - Use a memory reservation
    - If using Dynamic Memory, set Minimum RAM appropriately
  - Enough free disk space (2 to 4 times the size of durable memory-optimized tables)
  - A processor that supports the cmpxchg16b instruction
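As a quick sanity check, the sketch below verifies In-Memory OLTP support on an instance and adds the memory-optimized filegroup a database needs before it can hold any memory-optimized tables. The database name and file path are placeholders for illustration.

```sql
-- 1 = the instance and edition support In-Memory OLTP
SELECT SERVERPROPERTY('IsXTPSupported') AS IsXTPSupported;

-- A database needs a MEMORY_OPTIMIZED_DATA filegroup before it can
-- contain memory-optimized tables. 'SalesDB' and the path are hypothetical.
ALTER DATABASE SalesDB
    ADD FILEGROUP SalesDB_imoltp_fg CONTAINS MEMORY_OPTIMIZED_DATA;

ALTER DATABASE SalesDB
    ADD FILE (NAME = 'SalesDB_imoltp_dir',
              FILENAME = 'C:\Data\SalesDB_imoltp_dir')  -- a directory, not a file
    TO FILEGROUP SalesDB_imoltp_fg;
```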
Usage Scenarios:
Determine what issue you are solving. Does it fit one of the examples below? If yes, the recommended next step is to move to the planning phase.

- High-throughput and low-latency transaction processing
- Data ingestion, including IoT
  - Ingest large volumes of data from different sources
  - Common patterns:
    - Ingesting sensor readings and events
    - Managing batch updates from multiple sources while minimizing impact on the concurrent read workload
  - Implementation considerations when using this pattern
  - Improved batch performance and ingestion case studies
- Caching and session state
  - Very attractive for session state (e.g., ASP.NET applications) and caching
  - Gaming site and mid-tier use cases for In-Memory OLTP
- TempDB object replacement (see the sketch after this list)
  - Leverage to replace temp tables, table variables, and table-valued parameters
  - Utilize if you are looking to reduce temp-object CPU and log I/O
  - Note: Memory-Optimized TempDB Metadata (SQL Server 2019 and later) can be leveraged to reduce TempDB metadata contention
  - TVPs for IoT ingestion and table swapping in TempDB use cases
- ETL
  - Use non-durable memory-optimized tables for data staging
  - Use natively compiled stored procedures to do transformations on your staging tables
  - Natively compiled stored procedures benefit most for processes requiring more clock cycles
  - Natively compiled stored procedures have a reduced T-SQL surface area
  - A note of caution: loading data into in-memory tables is single-threaded, so you will need to get creative with your load process, SSIS packages, or BCP files
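To make the TempDB-replacement scenario concrete, here is a minimal sketch of a memory-optimized table type standing in for a table variable. The type name and columns are hypothetical, and the database is assumed to already have a MEMORY_OPTIMIZED_DATA filegroup.

```sql
-- Illustrative memory-optimized table type (names are made up)
CREATE TYPE dbo.SensorReadTableType AS TABLE
(
    SensorId     INT            NOT NULL,
    ReadingValue DECIMAL(10, 2) NOT NULL,
    INDEX ix_SensorId NONCLUSTERED HASH (SensorId) WITH (BUCKET_COUNT = 1024)
)
WITH (MEMORY_OPTIMIZED = ON);
GO

-- Used like any table variable, but without tempdb allocation or log I/O
DECLARE @reads dbo.SensorReadTableType;
INSERT INTO @reads (SensorId, ReadingValue) VALUES (1, 98.6);

-- Separately, on SQL Server 2019 and later, memory-optimized tempdb metadata
-- can be enabled instance-wide (requires a restart):
-- ALTER SERVER CONFIGURATION SET MEMORY_OPTIMIZED TEMPDB_METADATA = ON;
```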
Planning:
Utilize the Transaction Performance Analysis reports to determine whether a table or stored procedure should be ported to In-Memory OLTP.

- Run the Transaction Performance Analysis report against a representative production workload to:
  - Analyze the workload and determine hot spots where In-Memory OLTP could potentially help
  - Help plan and execute the migration to In-Memory OLTP by identifying incompatibilities and their potential impact
- Review the Transaction Performance Analysis report results:
  - Tables
    - Scan Statistics
      - Look for a high percentage of total accesses; the higher the percentage, the higher the table utilization, and the more the table could benefit from In-Memory OLTP.
      - Examining lookup and range scan statistics reveals possible gains from converting to In-Memory OLTP, thanks to its optimistic concurrency design.
    - Contention Statistics
      - Examine the percentage of total waits, latch statistics, and lock statistics.
      - Tables with a high percentage in any of these categories could see significant performance gains from migrating to In-Memory OLTP tables.
    - Migration Difficulties
      - Provides a difficulty rating.
      - Compare these ratings against the objects identified in the scan and contention statistics to help rank which tables should be ported to In-Memory OLTP.
    - Should the table be ported?
  - Stored Procedures
    - Execution Statistics
    - Table References
    - Should the stored procedure be ported?
- Once candidate tables and stored procedures are identified from the reports, generate an In-Memory OLTP migration checklist.
  - The checklist flags any table or stored procedure feature that is not supported
  - How to generate an In-Memory OLTP migration checklist
    - Tables
- Additional In-Memory OLTP planning considerations:
  - The table must be taken offline to convert it
  - Examine the exhaustive list of unsupported features; for example:
    - Database snapshots are not supported
    - Cross-database transactions are not supported
    - The READPAST table hint is not supported
    - Rowversion and sequence are not supported (the query after this list sketches one way to find rowversion columns)
  - Details on additional In-Memory OLTP planning and considerations
  - For those who prefer a T-SQL approach, Ned Otter, a strong advocate for In-Memory OLTP, has created a Migrating Tables to In-Memory OLTP script
- When considering In-Memory OLTP, examine your workload and identify how it could benefit from using in-memory temp tables, table variables, or functions, and not just your core tables and procedures.
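As one example of hunting for unsupported features ahead of a migration, this hypothetical helper query (our own sketch, not part of the SSMS reports) lists columns in the current database that use the rowversion data type:

```sql
-- rowversion columns block a table from being memory-optimized;
-- in sys.types the rowversion type carries the legacy name 'timestamp'
SELECT OBJECT_SCHEMA_NAME(c.object_id) AS [schema],
       OBJECT_NAME(c.object_id)        AS [table],
       c.name                          AS [column]
FROM sys.columns AS c
JOIN sys.types   AS t ON c.user_type_id = t.user_type_id
WHERE t.name = 'timestamp';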
Schema Durability:
- Schema only
  - Durability = SCHEMA_ONLY
  - Deploy if you want to maintain only the schema of the in-memory table (both durability options are sketched at the end of this section)
  - Key point: the data is not retained after a service restart or after the database goes offline
  - Use case examples:
    - Replacing temp tables
    - Storing temporary data for ETL processes / staging tables for data warehouses
    - Logging tables
- Schema and data
  - Durability = SCHEMA_AND_DATA
  - A primary key is required for a durable in-memory table
  - Deploy if your requirement is to maintain both the schema and the data of the in-memory table
  - Note: this will increase recovery time for the database (recovery recommendations are in the backup and recovery section of part two)
  - Use case examples:
    - Tables that require high throughput and low latency
- Starting with SQL Server 2016, automatic update of statistics is supported for in-memory tables
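A minimal sketch contrasting the two durability options, assuming the database already has a MEMORY_OPTIMIZED_DATA filegroup (the table and column names are made up for illustration):

```sql
-- Non-durable staging table: the schema survives a restart, the data does not
CREATE TABLE dbo.Staging_SensorReads
(
    SensorId     INT            NOT NULL,
    ReadingValue DECIMAL(10, 2) NOT NULL,
    INDEX ix_SensorId NONCLUSTERED HASH (SensorId) WITH (BUCKET_COUNT = 1000000)
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY);

-- Durable table: both schema and data survive; a primary key is required
CREATE TABLE dbo.SensorReads
(
    ReadId       BIGINT IDENTITY(1, 1) NOT NULL PRIMARY KEY NONCLUSTERED,
    SensorId     INT            NOT NULL,
    ReadingValue DECIMAL(10, 2) NOT NULL,
    ReadingTime  DATETIME2      NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```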
Natively compiled Stored Procedures:
- Use for business-critical, frequently executed parts of the application (a minimal example is sketched at the end of this section)
- Compiled into machine code and stored as a DLL
- All parameters passed are considered to have unknown values (the optimizer uses statistical data rather than sniffed parameter values)
  - Parameter sniffing is not used
  - Natively compiled stored procedures do support the OPTIMIZE FOR hint
  - The recommendation is to start with the default behavior of optimizing for unknown
- Natively compiled stored procedures are not recompiled when statistics change
  - The table should contain a representative set of data and statistics before the procedure is created. See the guidelines for deploying tables and procedures.
- In general, you can expect better performance from natively compiled stored procedures
  - They shine for highly concurrent workloads
  - Benefits increase as the number of rows processed increases
  - Use of the following should exhibit better performance:
    - Aggregation; nested-loop joins
    - Multi-statement SELECTs or DML operations
    - Complex expressions; procedural logic
- See the full list of supported constructs in natively compiled stored procedures
- See the list of constructs not supported by In-Memory OLTP
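To round out the section, here is a minimal natively compiled procedure over the hypothetical dbo.SensorReads table sketched earlier; NATIVE_COMPILATION, SCHEMABINDING, and the ATOMIC block are required parts of the syntax.

```sql
CREATE PROCEDURE dbo.usp_InsertSensorRead
    @SensorId     INT,
    @ReadingValue DECIMAL(10, 2)
WITH NATIVE_COMPILATION, SCHEMABINDING
AS
BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT,
                   LANGUAGE = N'us_english')
    -- Compiled to machine code; parameters are optimized as unknown values
    INSERT INTO dbo.SensorReads (SensorId, ReadingValue, ReadingTime)
    VALUES (@SensorId, @ReadingValue, SYSUTCDATETIME());
END;
GO

-- Example call
EXEC dbo.usp_InsertSensorRead @SensorId = 1, @ReadingValue = 98.6;
```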
Part two of the series will cover:
- Indexing
- Memory Management
- Performance Considerations
- Backup and Recovery
- Monitoring