DP-200: Implementing an Azure Data Solution
Data professionals capture and analyze exponentially growing volumes of data
- intermediate
- azure
- others
- azure-cosmos-db
- azure-data-lake
- azure-sql-data-warehouse
- azure-databricks
- azure-notebooks
- azure-stream-analytics
- azure-sql-database
- azure-storage
- azure-monitor
- azure-hdinsight
The content of this exam was updated on September 24, 2020. Please download the skills measured document below to see what changed.

Candidates for this exam are Microsoft Azure data engineers who collaborate with business stakeholders to identify and meet data requirements by implementing data solutions that use Azure data services.

Azure data engineers are responsible for data-related implementation tasks, including provisioning data storage services, ingesting streaming and batch data, transforming data, implementing security requirements, implementing data retention policies, identifying performance bottlenecks, and accessing external data sources.

Candidates for this exam must be able to implement data solutions that use the following Azure services: Azure Cosmos DB, Azure SQL Database, Azure Synapse Analytics (formerly Azure SQL DW), Azure Data Lake Storage, Azure Data Factory, Azure Stream Analytics, Azure Databricks, and Azure Blob storage.
Learning paths
Modules in this learning path
- Process streaming data with Azure Databricks structured streaming (7 Units, 46 min)
- Build and query a Delta Lake (7 Units, 43 min)
- Describe Azure Databricks best practices (8 Units, 38 min)
- Work with DataFrames advanced methods in Azure Databricks (6 Units, 42 min)
- Describe Azure Databricks Delta Lake architecture (5 Units, 33 min)
- Work with DataFrames in Azure Databricks (7 Units, 46 min)
- Describe Azure Databricks (7 Units, 53 min)
- Describe lazy evaluation and other performance features in Azure Databricks (7 Units, 38 min)
- Describe platform architecture, security, and data protection in Azure Databricks (9 Units, 1 hr 3 min)
- Create production workloads on Azure Databricks with Azure Data Factory (5 Units, 33 min)
- Work with DataFrames columns in Azure Databricks (6 Units, 43 min)
- Integrate Azure Databricks with other Azure services (5 Units, 33 min)
- Implement CI/CD with Azure DevOps (5 Units, 28 min)
- Read and write data in Azure Databricks (9 Units, 1 hr)
- Spark architecture fundamentals (5 Units, 33 min)
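Several of the modules above (lazy evaluation, DataFrames) rest on one idea: Spark transformations only build an execution plan, and no data is touched until an action forces the whole pipeline to run. The behavior can be sketched in plain Python with a generator; this is an analogy under stated assumptions, not Spark code, and `transform_log` is an invented name:

```python
# Lazy evaluation sketch: like a Spark transformation, calling the
# generator builds a pipeline without processing anything; only the
# final "action" (list) forces computation over the data.

def transform_log(data):
    """Double each element, recording when it is actually processed."""
    for x in data:
        transform_log.evaluated.append(x)
        yield x * 2

transform_log.evaluated = []

source = range(5)
pipeline = transform_log(source)      # "transformation": nothing runs yet
assert transform_log.evaluated == []  # no element has been touched

result = list(pipeline)               # "action": triggers the pipeline
assert result == [0, 2, 4, 6, 8]
assert transform_log.evaluated == [0, 1, 2, 3, 4]
```

The same deferral is why Spark can fuse several transformations into one pass over the data before anything executes.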
Modules in this learning path
- Understand the evolving world of data (6 Units, 28 min): Learn how data systems are evolving and how the changes affect data professionals. Explore the differences between on-premises and cloud data solutions, and consider sample business cases that apply cloud technologies.
- Survey the services on the Azure Data platform (11 Units, 58 min): Learn about Azure technologies that analyze text, images, and relational, nonrelational, or streaming data. See how data engineers can choose the technologies that meet their business needs and scale to meet demand securely.
- Identify the tasks of a data engineer in a cloud-hosted architecture (5 Units, 25 min): Learn about the responsibilities of a data engineer and how they relate to the jobs of other data and AI professionals. Explore common data engineering practices and a high-level architecting process for a data engineering project.
Modules in this learning path
- Choose a data storage approach in Azure (6 Units, 30 min): Learn how to choose among Azure Storage, Azure SQL Database, and Azure Cosmos DB, or a combination of them, to build the most performant solution for your business scenario.
- Create an Azure Storage account (7 Units, 30 min): Create an Azure Storage account with the correct options for your business needs.
- Connect an app to Azure Storage (12 Units, 1 hr 15 min): Create a simple application and add configuration, client library references, and code to connect it to Azure Storage.
- Secure your Azure Storage account (8 Units, 45 min): Learn how Azure Storage provides multilayered security to protect your data. Find out how to use access keys, secure networks, and Advanced Threat Protection to proactively monitor your system.
- Store application data with Azure Blob storage (8 Units, 50 min): Build an app that stores user files with Azure Blob storage.
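The security module above covers storage account access keys. Behind that feature, Azure Storage's Shared Key authorization signs each request with an HMAC-SHA256 over a canonicalized "string to sign", keyed with the base64-decoded account key. The sketch below shows only that signing step; the account key, account name, and string-to-sign are simplified placeholders, not the full canonical format:

```python
import base64
import hashlib
import hmac

# Simplified Shared Key signing sketch. The key and string-to-sign are
# illustrative placeholders; a real request canonicalizes many headers.
account_key = base64.b64encode(b"example-secret-key-not-real").decode()
string_to_sign = (
    "GET\n\n\n"
    "x-ms-date:Mon, 01 Jan 2024 00:00:00 GMT\n"
    "/myaccount/mycontainer"
)

digest = hmac.new(
    base64.b64decode(account_key),      # key is base64-decoded before use
    string_to_sign.encode("utf-8"),
    hashlib.sha256,
).digest()
signature = base64.b64encode(digest).decode()

# The signature travels in the Authorization header of the request.
auth_header = f"SharedKey myaccount:{signature}"
print(auth_header)
```

This is also why a leaked account key is so dangerous: anyone holding it can sign arbitrary requests, which is what key rotation and SAS tokens mitigate.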
Modules in this learning path
- Migrate your relational data stored in SQL Server to Azure SQL Database (8 Units, 1 hr 24 min): Plan and implement the migration of relational data from SQL Server to Azure SQL Database. Select the appropriate migration strategy for your database, and then use Data Migration Assistant to do the assessment and the migration.
- Secure your Azure SQL Database (8 Units, 1 hr 7 min): Secure your Azure SQL Database to keep your data safe, and diagnose potential security concerns as they happen.
- Provision an Azure SQL database to store application data (5 Units, 50 min): Use the Azure portal and Cloud Shell to provision, connect to, and manage Azure SQL databases.
- Create an Azure Database for PostgreSQL server (7 Units, 44 min): Azure Database for PostgreSQL provides a fully managed, enterprise-ready community PostgreSQL database as a service. Learn how to create and deploy this service and connect it to your applications.
- Scale multiple Azure SQL Databases with SQL elastic pools (7 Units, 39 min): SQL elastic pools let you manage performance and cost for a collection of SQL databases.
- Develop and configure an ASP.NET application that queries an Azure SQL database (6 Units, 50 min): Create a database on Azure SQL Database to store data for an application, and create an ASP.NET application to query data from the database.
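The application-development module above queries Azure SQL Database from ASP.NET; whatever the driver or language, the core pattern is the parameterized query, which keeps user input out of the SQL text and so blocks injection. A minimal sketch using Python's built-in sqlite3 as a stand-in engine (the `Products` table and its rows are invented for illustration, not from the course):

```python
import sqlite3

# Stand-in for an Azure SQL database: an in-memory SQLite instance.
# Table name and rows are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE Products (Id INTEGER PRIMARY KEY, Name TEXT, Price REAL)"
)
conn.executemany(
    "INSERT INTO Products (Name, Price) VALUES (?, ?)",
    [("Widget", 9.99), ("Gadget", 24.50), ("Gizmo", 3.75)],
)

# Parameterized query: the user-supplied value is bound as a parameter,
# never concatenated into the SQL string.
max_price = 10.0
rows = conn.execute(
    "SELECT Name, Price FROM Products WHERE Price <= ? ORDER BY Price",
    (max_price,),
).fetchall()

assert rows == [("Gizmo", 3.75), ("Widget", 9.99)]
```

In ADO.NET the same idea appears as `SqlCommand` with `SqlParameter` objects; only the placeholder syntax differs.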
Modules in this learning path
- Create an Azure Cosmos DB database built to scale (7 Units, 27 min): Learn how to create an Azure Cosmos DB account, database, and container built to scale as your application grows.
- Distribute your data globally with Azure Cosmos DB (6 Units, 31 min)
- Insert and query data in your Azure Cosmos DB database (8 Units, 58 min): Learn how to add data to your database and query NoSQL data in Azure Cosmos DB.
- Choose the appropriate API for Azure Cosmos DB storage (10 Units, 38 min): Compare the different APIs available in Azure Cosmos DB, highlighting and contrasting the scenarios that would make each a good choice for a company.
- Store and access graph data in Azure Cosmos DB with the Graph API (8 Units, 42 min): Create an application that uses Azure Cosmos DB as a graph database. Store, retrieve, and manipulate graph data, and visualize it with Data Explorer.
- Store and access NoSQL data with Azure Cosmos DB and the Table API (4 Units, 29 min): Get lower latency, higher availability, and easier scaling by moving your Azure Table storage application to Azure Cosmos DB.
- Build a .NET Core app for Azure Cosmos DB in Visual Studio Code (7 Units, 54 min): Learn how to build a database app to store and query data in Azure Cosmos DB by using Visual Studio Code.
- Optimize the performance of Azure Cosmos DB by using partitioning and indexing strategies (9 Units, 1 hr 16 min): Learn how to measure the throughput of your NoSQL database and assess the impact of partition key and indexing strategies on throughput.
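The partitioning module above turns on one mechanism: Cosmos DB hashes each item's partition key to route it to a partition, so throughput only scales when key values spread writes evenly. The rough simulation below uses a generic MD5-modulo scheme, which is an assumption for illustration and not Cosmos DB's actual hash, to contrast a high-cardinality key with a skewed one:

```python
import hashlib
from collections import Counter

def route(partition_key: str, partition_count: int) -> int:
    """Map a partition key to a partition index by hashing.
    Generic sketch; not Cosmos DB's real hashing algorithm."""
    h = int(hashlib.md5(partition_key.encode()).hexdigest(), 16)
    return h % partition_count

# High-cardinality key (e.g. userId): many distinct values spread load.
user_items = [f"user-{i}" for i in range(1000)]
spread = Counter(route(k, 10) for k in user_items)

# Skewed key (e.g. country, with one dominant value): load concentrates.
skewed_items = ["US"] * 900 + ["CA"] * 100
skew = Counter(route(k, 10) for k in skewed_items)

assert len(spread) == 10           # every partition receives items
assert max(skew.values()) == 900   # one hot partition absorbs 90% of writes
```

The hot partition in the second case is why a poorly chosen partition key caps effective throughput no matter how many request units are provisioned.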
Modules in this learning path
- Secure your Azure Storage account (8 Units, 45 min): Learn how Azure Storage provides multilayered security to protect your data. Find out how to use access keys, secure networks, and Advanced Threat Protection to proactively monitor your system.
- Introduction to Azure Data Lake storage (7 Units, 30 min): Learn how Azure Data Lake Storage provides a cloud storage service that is highly available, secure, durable, scalable, and redundant, and brings new efficiencies to processing big data analytics workloads.
- Upload data to Azure Data Lake Storage (5 Units, 50 min): Learn various ways to upload data to Data Lake Storage Gen2: through the Azure portal, Azure Storage Explorer, or .NET, or by copying the data in Azure Data Factory.
Modules in this learning path
- Work with data streams by using Azure Stream Analytics (5 Units, 25 min): Explore how Azure Stream Analytics integrates with your applications or Internet of Things (IoT) devices to gain insights with real-time streaming data. Learn how to consume and analyze data streams and derive actionable results.
- Transform data by using Azure Stream Analytics (10 Units, 49 min): Learn how to create Azure Stream Analytics jobs to process input data, transform it with a query, and return results.
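The Stream Analytics modules above transform streams with windowed queries. The simplest window, a tumbling window, chops the timeline into fixed, non-overlapping intervals and aggregates each one. A real job expresses this in Stream Analytics' SQL-like query language (e.g. with `TumblingWindow`); the pure-Python simulation below, with invented sensor events, just shows the bucketing logic:

```python
from collections import defaultdict

def tumbling_count(events, window_seconds):
    """Count events per fixed, non-overlapping time window.
    events: iterable of (timestamp_seconds, payload) pairs."""
    windows = defaultdict(int)
    for ts, _payload in events:
        # Each event belongs to exactly one window, keyed by its start time.
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start] += 1
    return dict(windows)

# Illustrative events: (timestamp in seconds, sensor reading).
events = [(1, "a"), (3, "b"), (4, "c"), (6, "d"), (9, "e"), (12, "f")]
counts = tumbling_count(events, window_seconds=5)

# Windows [0,5) -> 3 events, [5,10) -> 2, [10,15) -> 1
assert counts == {0: 3, 5: 2, 10: 1}
```

Hopping and sliding windows generalize this by letting windows overlap, so a single event can contribute to several aggregates.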
Modules in this learning path
- Design a data warehouse with Azure Synapse Analytics (8 Units, 47 min): Learn how to create a data warehouse in Azure Synapse Analytics in minutes to take advantage of massively parallel processing (MPP) and run complex queries at petabyte scale quickly.
- Query data in Azure Synapse Analytics (6 Units, 1 hr): Learn how to issue simple queries against Azure Synapse Analytics and how to visualize the data in both Excel and Power BI.
- Import data into Azure Synapse Analytics by using PolyBase (8 Units, 1 hr 6 min): Load petabytes of data into an Azure Synapse Analytics database in minutes by using PolyBase.
Additional courses
The learning paths above cover the knowledge and skills needed to pass the exam and become certified. Enrolling in this track also enrolls you in the Microsoft Official Classroom course below, which you can use as an extra reference to prepare for the exam.
Summary
- Length
- 3 days
- Level
- Beginner
- Language
- English
About this course
In this course, students will implement various data platform technologies into solutions that meet business and technical requirements, including on-premises, cloud, and hybrid data scenarios that incorporate both relational and NoSQL data. They will also learn how to process data by using a range of technologies and languages for both streaming and batch data.
Students will also explore how to implement data security, including authentication, authorization, and data policies and standards. They will define and implement monitoring for both data storage and data processing activities. Finally, they will manage and troubleshoot Azure data solutions, which includes the optimization and disaster recovery of big data, batch processing, and streaming data solutions.
Prerequisites
- Azure fundamentals
Audience
The primary audience for this course is data professionals, data architects, and business intelligence professionals who want to learn about the data platform technologies that exist on Microsoft Azure.
The secondary audience for this course is individuals who develop applications that deliver content from the data platform technologies that exist on Microsoft Azure.