What is an Elastic Pool: As the name suggests, it is a pool of Azure SQL databases.
- Each elastic pool contains a set of Azure SQL databases.
- The pool's capacity is shared among the databases in it.
- An elastic pool makes the best use of capacity and resources.
- It is also very cost-effective.
- It is the best option for dealing with unpredictable database load while remaining cost-effective.
- While creating an elastic pool, Azure suggests which databases to include in the pool and what min/max DTU limits to set, based on the DTU usage of the databases.
Example: Let’s assume you have 10 Basic Azure SQL databases, each of which normally uses 5 DTUs. During setup you assign each database 10 DTUs (to allow for scaling/peaks), i.e. 100 DTUs in total, while average consumption is only 50 DTUs. So you are paying for 50 extra DTUs, i.e. double.
By putting them in an elastic pool and capping the pool at a suitable eDTU level (for example, 60), you can handle unexpected database load while still achieving a cost-effective solution.
P.S.: The above example assumes that all databases do not reach their peak at the same time.
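The arithmetic in the example above can be sketched in a few lines of Python. The per-DTU price used here is a made-up placeholder for illustration, not an actual Azure price.

```python
# Hypothetical monthly price per DTU/eDTU -- a placeholder, not an Azure rate.
PRICE_PER_DTU = 1.5

def standalone_cost(num_dbs, dtu_per_db, price=PRICE_PER_DTU):
    """Each database is provisioned individually for its own peak."""
    return num_dbs * dtu_per_db * price

def pooled_cost(pool_edtu, price=PRICE_PER_DTU):
    """All databases share one pool sized for the combined (non-simultaneous) peaks."""
    return pool_edtu * price

# 10 databases provisioned at 10 DTUs each = 100 DTUs paid for,
# versus a single 60-eDTU pool covering the same workload.
print(standalone_cost(10, 10))  # 150.0
print(pooled_cost(60))          # 90.0
```

With these placeholder numbers, the pool costs 40% less for the same workload, which is the point of the example: you pay for shared headroom once instead of per database.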
Elastic Job: Elastic Job is an Azure feature. It is a database job which executes T-SQL scripts across multiple databases, automatically retrying on failure and providing eventual-completion guarantees. To manage scaled-out sharded databases, an elastic job can execute a script across databases including
- a custom-defined collection of databases.
- all databases in an elastic pool
- a shard set (created using Elastic Database client library).
With jobs, you can easily and reliably manage large groups of Azure SQL Databases by running Transact-SQL scripts to perform administrative operations.
Why use jobs: Normally, you must connect to each database individually to execute a script. A job runs the script automatically across all the target databases, with automatic retry when a run fails. The main benefits are:
- Less human intervention
- Reduced overhead
- Easier management
- Automatic retry mechanism
- Jobs can be scheduled
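The "automatic retry" behavior can be sketched as a simple loop. This is only an illustration of what the jobs service automates, not the actual service implementation: `run_script` is a stand-in for connecting to a database and executing the T-SQL, and the attempt limit is arbitrary.

```python
import time

def run_job(databases, run_script, max_attempts=3, delay=0.0):
    """Run a script against every database, retrying the ones that fail.

    `run_script(db)` stands in for connecting to database `db` and
    executing the T-SQL; it should raise an exception on failure.
    Returns the set of databases that still failed after all attempts.
    """
    pending = list(databases)
    for attempt in range(max_attempts):
        failed = []
        for db in pending:
            try:
                run_script(db)
            except Exception:
                failed.append(db)  # keep for the next attempt
        if not failed:
            return set()
        time.sleep(delay)  # back off before retrying
        pending = failed
    return set(pending)
```

The real service also persists per-database job state in a control database so that runs survive restarts; this sketch captures only the retry loop.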
How Elastic Database jobs work: Execution of a job is carried out by several types of job tasks:
- ShardMapRefresh: Queries the shard map to determine all the databases used as shards
- ScriptSplit: Splits the script into batches at ‘GO’ statements
- ExpandJob: Creates a child job for each database when a job targets a group of databases.
- ScriptExecution: Executes the script on each database using the defined credentials.
- Dacpac: Applies a DACPAC to a particular database using the defined credentials.
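The ScriptSplit step exists because ‘GO’ is a client-side batch separator, not T-SQL: it is never sent to the server, so a script must be split into batches first. A simplified Python sketch of that split (it ignores edge cases such as ‘GO n’ repeat counts or ‘GO’ inside comments and strings):

```python
def split_batches(script):
    """Split a T-SQL script into batches at lines containing only GO."""
    batches, current = [], []
    for line in script.splitlines():
        if line.strip().upper() == "GO":
            if current:  # close the batch collected so far
                batches.append("\n".join(current))
                current = []
        else:
            current.append(line)
    if current:  # a trailing batch with no final GO
        batches.append("\n".join(current))
    return batches

script = """CREATE TABLE t (id INT)
GO
INSERT INTO t VALUES (1)
GO"""
print(split_batches(script))
# ['CREATE TABLE t (id INT)', 'INSERT INTO t VALUES (1)']
```

Each resulting batch is then what the ScriptExecution task would send to a database in its own request.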
Further reading: To learn more, please see the following links.