DP-201: Designing an Azure Data Solution

Data professionals capture and analyze exponentially growing amounts of data

Summary

  • intermediate
  • azure
  • others
  • azure-cosmos-db
  • azure-data-lake
  • azure-sql-data-warehouse
  • azure-databricks
  • azure-notebooks
  • azure-stream-analytics
  • azure-sql-database
  • azure-storage
  • azure-monitor
  • azure-hdinsight
In response to the coronavirus (COVID-19) situation, Microsoft is implementing several temporary changes to its training and certification program. The content of this exam was updated on July 31, 2020.

Candidates for this exam are Microsoft Azure data engineers who collaborate with business stakeholders to identify and meet data requirements, and who design data solutions that use Azure data services.

Azure data engineers are responsible for data-related design tasks, including designing Azure data storage solutions that use relational and non-relational data stores, batch and real-time data processing solutions, and data security and compliance solutions.

Candidates for this exam must design data solutions that use the following Azure services: Azure Cosmos DB, Azure Synapse Analytics, Azure Data Lake Storage, Azure Data Factory, Azure Stream Analytics, Azure Databricks, and Azure Blob storage.

Learning paths

10 hr 32 min
Data engineering with Azure Databricks

Modules in this learning path

  • Process streaming data with Azure Databricks structured streaming
    7 Units
    46 min

  • Build and query a Delta Lake
    7 Units
    43 min

  • Describe Azure Databricks best practices
    8 Units
    38 min

  • Work with DataFrames advanced methods in Azure Databricks
    6 Units
    42 min

  • Describe Azure Databricks Delta Lake architecture
    5 Units
    33 min

  • Work with DataFrames in Azure Databricks
    7 Units
    46 min

  • Describe Azure Databricks
    7 Units
    53 min

  • Describe lazy evaluation and other performance features in Azure Databricks
    7 Units
    38 min

  • Describe platform architecture, security, and data protection in Azure Databricks
    9 Units
    1 hr 3 min

  • Create production workloads on Azure Databricks with Azure Data Factory
    5 Units
    33 min

  • Work with DataFrames columns in Azure Databricks
    6 Units
    43 min

  • Integrate Azure Databricks with other Azure services
    5 Units
    33 min

  • Implement CI/CD with Azure DevOps
    5 Units
    28 min

  • Read and write data in Azure Databricks
    9 Units
    1 hr 0 min

  • Spark architecture fundamentals
    5 Units
    33 min

1 hr 51 min
Explore how the world of data has evolved and how the advent of cloud technologies provides new opportunities for businesses. You will learn about the various data platform technologies that are available and how a data engineer can use them to benefit an organization.

Modules in this learning path

  • Understand the evolving world of data
    6 Units
    28 min

    Learn how data systems are evolving and how the changes affect data professionals. Explore the differences between on-premises and cloud data solutions, and consider sample business cases that apply cloud technologies.

  • Survey the services on the Azure Data platform
    11 Units
    58 min

    Learn about Azure technologies that analyze text and images and relational, nonrelational, or streaming data. See how data engineers can choose the technologies that meet their business needs and scale to meet demand securely.

  • Identify the tasks of a data engineer in a cloud-hosted architecture
    5 Units
    25 min

    Learn about the responsibilities of a data engineer. Find out how they relate to the jobs of other data and AI professionals. Explore common data engineering practices and a high-level architecting process for a data-engineering project.

3 hr 50 min
Azure provides a variety of ways to store data: unstructured, archival, relational, and more. Learn the basics of storage management in Azure, how to create a Storage Account, and how to choose the right model for the data you want to store in the cloud.

Modules in this learning path

  • Choose a data storage approach in Azure
    6 Units
    30 min

Learn how to choose Azure Storage, Azure SQL Database, Azure Cosmos DB, or a combination of them to build the most performant solution for your business scenario.

  • Create an Azure Storage account
    7 Units
    30 min

    Create an Azure Storage account with the correct options for your business needs.

  • Connect an app to Azure Storage
    12 Units
    1 hr 15 min

    Create a simple application and add configuration, client library references, and code to connect it to Azure Storage.

  • Secure your Azure Storage account
    8 Units
    45 min

Learn how Azure Storage provides multilayered security to protect your data. Find out how to use access keys, secure networks, and use Advanced Threat Protection to proactively monitor your system.

  • Store application data with Azure Blob storage
    8 Units
    50 min

    Build an app that stores user files with Azure Blob storage.
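The security module in this path covers access keys and shared access signatures, which are based on HMAC-SHA256 signing. A minimal sketch of how a SAS-style signature could be computed, assuming a simplified string-to-sign and a made-up key (the real Azure SAS format concatenates many more fields):

```python
import base64
import hashlib
import hmac

def sign_sas(string_to_sign: str, account_key_b64: str) -> str:
    """Sign a string with an account key, SAS-style: HMAC-SHA256, base64-encoded."""
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")

# Hypothetical account key and a simplified string-to-sign; real SAS tokens
# include permissions, start/expiry times, resource, and protocol fields.
demo_key = base64.b64encode(b"not-a-real-storage-account-key").decode("utf-8")
signature = sign_sas("r\n2020-07-31T00:00:00Z\n/blob/myaccount/mycontainer", demo_key)
print(signature)
```

Because the service recomputes the same HMAC with its copy of the key, anyone who tampers with the signed fields invalidates the signature.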

4 hr 10 min
Azure supports several popular SQL-based database solutions including SQL Server, PostgreSQL, and MySQL. Learn how to use these enterprise data solutions in Azure to store and retrieve your app's data in the cloud.

Modules in this learning path

  • Migrate your relational data stored in SQL Server to Azure SQL Database
    8 Units
    1 hr 24 min

    Plan and implement the migration of relational data from SQL Server to Azure SQL Database. Select the appropriate migration strategy for your database, and then use Data Migration Assistant to do the assessment and the migration.

  • Secure your Azure SQL Database
    8 Units
    1 hr 7 min

    Secure your Azure SQL Database to keep your data secure and diagnose potential security concerns as they happen.

  • Provision an Azure SQL database to store application data
    5 Units
    50 min

    Use the Azure Portal and Cloud Shell to provision, connect, and manage Azure SQL databases.

  • Create an Azure Database for PostgreSQL server
    7 Units
    44 min

Azure Database for PostgreSQL provides a fully managed, enterprise-ready community PostgreSQL database as a service. Learn how to create and deploy this service and connect it to your applications.

  • Scale multiple Azure SQL Databases with SQL elastic pools
    7 Units
    39 min

    SQL elastic pools allow you to manage performance and cost for a collection of SQL databases.

  • Develop and configure an ASP.NET application that queries an Azure SQL database
    6 Units
    50 min

    Create a database on Azure SQL Database to store data for an application, and create an ASP.NET application to query data from the database.
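The elastic-pools module above manages shared performance across a collection of databases. As a rough illustration of why pooling helps with bursty workloads, here is a hedged sketch comparing per-database provisioning against a shared pool; the DTU figures are invented for illustration, not Azure pricing:

```python
# Hypothetical peak DTU demand per database. Each database bursts at a
# different time, so a shared pool only needs to cover the combined
# worst-case concurrent demand, not the sum of all peaks.
peak_dtus = {"db1": 50, "db2": 50, "db3": 50, "db4": 50}

# Provisioning each database individually for its own peak:
per_database_total = sum(peak_dtus.values())  # 200 DTUs

# If at most two databases burst at the same time, a pool sized for the
# worst-case concurrent demand suffices:
max_concurrent_bursts = 2
pool_size = max_concurrent_bursts * max(peak_dtus.values())  # 100 eDTUs

print(per_database_total, pool_size)  # → 200 100
```

The fewer databases that peak simultaneously, the larger the saving from sharing a pool.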

5 hr 55 min
NoSQL data is an efficient way to store information that doesn’t map to the requirements of a relational SQL database. Learn how to use the Azure portal, the Azure Cosmos DB extension for Visual Studio Code, and the Azure Cosmos DB .NET Core SDK to work with your NoSQL data where you want, and provide your users with high availability, no matter where they are in the world.
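As the path description notes, NoSQL data often doesn't map cleanly onto relational tables. A hedged sketch of the difference, using an invented nested JSON document of the kind Azure Cosmos DB stores:

```python
import json

# A relational model would split this order across orders, customers, and
# order_items tables joined by keys; a document store keeps it in one item.
order_document = {
    "id": "order-1001",
    "customer": {"name": "Contoso", "region": "westus"},
    "items": [
        {"sku": "widget", "qty": 3},
        {"sku": "gadget", "qty": 1},
    ],
}

# Documents are self-describing: items can vary in shape without a schema change.
serialized = json.dumps(order_document)
restored = json.loads(serialized)
print(restored["customer"]["region"])  # → westus
```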

Modules in this learning path

  • Create an Azure Cosmos DB database built to scale
    7 Units
    27 min

    Learn how to create an Azure Cosmos DB account, database, and container built to scale as your application grows.

  • Distribute your data globally with Azure Cosmos DB
    6 Units
    31 min

  • Insert and query data in your Azure Cosmos DB database
    8 Units
    58 min

    Learn how to add data to your database and query NoSQL data in Azure Cosmos DB.

  • Choose the appropriate API for Azure Cosmos DB storage
    10 Units
    38 min

    You will compare the different APIs available in Azure Cosmos DB, highlighting and contrasting the different scenarios that would make them a good choice for a company.

  • Store and access graph data in Azure Cosmos DB with the Graph API
    8 Units
    42 min

Create an application that uses Azure Cosmos DB as a graph database. Store, retrieve, and manipulate graph data, and visualize it by using Data Explorer.

  • Store and Access NoSQL Data with Azure Cosmos DB and the Table API
    4 Units
    29 min

    Get lower latency, higher availability, and easier scaling by moving your Azure Storage Tables application to Azure Cosmos DB.

  • Build a .NET Core app for Azure Cosmos DB in Visual Studio Code
    7 Units
    54 min

    Learn how to build a database app to store and query data in Azure Cosmos DB by using Visual Studio Code.

  • Optimize the performance of Azure Cosmos DB by using partitioning and indexing strategies
    9 Units
    1 hr 16 min

    Learn how to measure the throughput of your NoSQL database and assess the impact of partition key and indexing strategies on throughput.
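The partitioning module above assesses how the partition key affects throughput. The core idea is that Cosmos DB hashes the partition-key value to place items in physical partitions, so a low-cardinality key funnels all traffic to a few partitions. A simplified stdlib sketch of that effect (Cosmos DB uses its own internal hash and partition layout, not this one):

```python
import hashlib
from collections import Counter

def partition_for(key: str, partition_count: int = 4) -> int:
    """Map a partition-key value to a partition by hashing it (illustrative only)."""
    digest = hashlib.sha256(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % partition_count

# A high-cardinality key (e.g. a user ID) spreads load across partitions...
user_spread = Counter(partition_for(f"user-{i}") for i in range(1000))

# ...while a low-cardinality key (e.g. a country) concentrates it on one.
country_spread = Counter(partition_for(c) for c in ["US"] * 1000)

print(len(user_spread), len(country_spread))
```

A "hot" partition like the second case caps throughput at a single partition's limit no matter how much total capacity is provisioned.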

2 hr 5 min
In this learning path, you will see how Azure Data Lake Storage can make big data analytical solutions more efficient and how easy it is to set up. You will also explore how it fits into common architectures and the different methods of uploading data to the data store. Finally, you will examine the security features that help keep your data secure.

Modules in this learning path

  • Secure your Azure Storage account
    8 Units
    45 min

Learn how Azure Storage provides multilayered security to protect your data. Find out how to use access keys, secure networks, and use Advanced Threat Protection to proactively monitor your system.

  • Introduction to Azure Data Lake storage
    7 Units
    30 min

    Learn how Azure Data Lake Storage provides a cloud storage service that is highly available, secure, durable, scalable, and redundant and brings new efficiencies to processing big data analytics workloads.

  • Upload data to Azure Data Lake Storage
    5 Units
    50 min

    Learn various ways to upload data to Data Lake Storage Gen 2. Upload data through the Azure portal, Azure Storage Explorer, or .NET. Or copy the data in Azure Data Factory.

1 hr 14 min
Learn the concepts of event processing and streaming data and how they apply to Azure Stream Analytics. You will then set up a Stream Analytics job to stream data and learn how to manage and monitor a running job.
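A central concept in the stream-processing modules below is windowing: grouping an unbounded event stream into finite time windows for aggregation. A hedged sketch of a tumbling-window count in plain Python (Stream Analytics itself expresses this declaratively in its SQL-like language, e.g. `GROUP BY TumblingWindow(second, 10)`):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Count events per fixed, non-overlapping window of window_seconds."""
    counts = defaultdict(int)
    for timestamp, _payload in events:
        # Each event belongs to exactly one window, aligned to window_seconds.
        window_start = (timestamp // window_seconds) * window_seconds
        counts[window_start] += 1
    return dict(counts)

# Events as (unix_timestamp, payload) pairs from a hypothetical IoT device.
events = [(0, "a"), (3, "b"), (9, "c"), (10, "d"), (25, "e")]
print(tumbling_window_counts(events, 10))  # → {0: 3, 10: 1, 20: 1}
```

Tumbling windows never overlap; hopping and sliding windows relax that constraint and can assign one event to several windows.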

Modules in this learning path

  • Work with data streams by using Azure Stream Analytics
    5 Units
    25 min

    Explore how Azure Stream Analytics integrates with your applications or Internet of Things (IoT) devices to gain insights with real-time streaming data. Learn how to consume and analyze data streams and derive actionable results.

  • Transform data by using Azure Stream Analytics
    10 Units
    49 min

    Learn how to create Azure Stream Analytics jobs to process input data, transform it with a query, and return results.

2 hr 53 min
Azure Synapse Analytics provides a relational big data store that can scale to petabytes of data. In this learning path, you will learn how Azure Synapse Analytics achieves this scale with its massively parallel processing (MPP) architecture. Create a data warehouse in minutes, use a familiar query language to build reports, load massive amounts of data in minutes, and ensure that your data warehouse is secure.
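The MPP engine scales by splitting a table's rows across 60 distributions that are scanned in parallel. A simplified sketch of hash distribution, assuming Python's hashlib stands in for the engine's internal hash function:

```python
import hashlib
from collections import Counter

DISTRIBUTION_COUNT = 60  # Synapse dedicated SQL pools spread rows over 60 distributions

def distribution_for(distribution_column_value: str) -> int:
    """Pick a distribution for a row by hashing its distribution column (illustrative)."""
    digest = hashlib.sha256(distribution_column_value.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % DISTRIBUTION_COUNT

# Hash-distributing 100,000 hypothetical order IDs gives each distribution a
# roughly equal slice of rows to scan in parallel.
spread = Counter(distribution_for(f"order-{i}") for i in range(100_000))
print(len(spread), min(spread.values()), max(spread.values()))
```

A good distribution column is high-cardinality and evenly used, so no single distribution becomes a hotspot during queries or loads.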

Modules in this learning path

  • Design a data warehouse with Azure Synapse Analytics
    8 Units
    47 min

    Learn how to create a data warehouse in Azure Synapse Analytics in minutes to take advantage of massively parallel processing (MPP) and run complex queries at petabyte scale quickly.

  • Query data in Azure Synapse Analytics
    6 Units
    1 hr 0 min

    Learn how to issue simple queries against Azure Synapse Analytics and how to visualize the data in both Excel and Power BI.

  • Import data into Azure Synapse Analytics by using PolyBase
    8 Units
    1 hr 6 min

    Load petabytes of data into an Azure Synapse Analytics database in minutes by using PolyBase.

Additional courses

The learning paths above prepare you for the knowledge and skills needed to pass the exam and become certified. Enrolling in this track also enrolls you in the Microsoft Official Classroom course below. You can use this course as an extra reference to prepare for the exam.

Designing an Azure Data Solution

Summary

Length
2 days
Level
Beginner
Language
English

About this course

In this course, students will design various data platform technologies into solutions that are in line with business and technical requirements. These can include on-premises, cloud, and hybrid data scenarios that incorporate relational, NoSQL, or data warehouse data. Students will also learn how to design process architectures using a range of technologies for both streaming and batch data.

Students will also explore how to design data security, including data access, data policies, and standards. In addition, they will design Azure data solutions, which includes the optimization, availability, and disaster recovery of big data, batch processing, and streaming data solutions.

Audience profile

The audience for this course is data professionals, data architects, and business intelligence professionals who want to learn about the data platform technologies that exist on Microsoft Azure.

The secondary audience for this course is individuals who develop applications that deliver content from the data platform technologies that exist on Microsoft Azure.

Prerequisites

  • Azure fundamentals
  • DP-200: Implementing an Azure Data Solution