MS-DP-200T01 - Implementing an Azure Data Solution

In this course, students will implement various data platform technologies into solutions that align with business and technical requirements, including on-premises, cloud, and hybrid data scenarios incorporating both relational and NoSQL data. They will also learn how to process data using a range of technologies and languages for both streaming and batch data.

Note: This course is not Microsoft Software Assurance Training Voucher (SATV) eligible.

Students will also explore how to implement data security, including authentication, authorization, and data policies and standards. They will define and implement monitoring for both data storage and data processing activities. Finally, they will manage and troubleshoot Azure data solutions, including the optimization and disaster recovery of big data, batch processing, and streaming data solutions.

Audience profile
The primary audience for this course is data professionals, data architects, and business intelligence professionals who want to learn about the data platform technologies that exist on Microsoft Azure.

The secondary audience for this course is individuals who develop applications that deliver content from the data platform technologies that exist on Microsoft Azure.

Prerequisites


In addition to their professional experience, students who take this training should have technical knowledge equivalent to the following courses:
Azure Fundamentals

Detailed Class Syllabus


Module 1: Azure for the Data Engineer


Explain the evolving world of data
Survey the services in the Azure Data Platform
Identify the tasks that are performed by a Data Engineer
Describe the use cases for the cloud in a Case Study

Module 2: Working with Data Storage


Choose a data storage approach in Azure
Create an Azure Storage Account
Explain Azure Data Lake storage
Upload data into Azure Data Lake
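
As a rough illustration of the storage topics above (not a course lab), the following Python sketch uploads a local file to an Azure Data Lake Storage Gen2 account using the azure-storage-file-datalake SDK; the account name, key, file system, and paths are placeholder assumptions.

    from azure.storage.filedatalake import DataLakeServiceClient

    # Placeholder values -- substitute your own storage account name and key.
    ACCOUNT_NAME = "mydatalakeaccount"
    ACCOUNT_KEY = "<storage-account-key>"

    # Connect to the Data Lake Storage Gen2 (dfs) endpoint of the storage account.
    service = DataLakeServiceClient(
        account_url=f"https://{ACCOUNT_NAME}.dfs.core.windows.net",
        credential=ACCOUNT_KEY,
    )

    # Create a new file system and directory for the raw data, then upload a file.
    file_system = service.create_file_system(file_system="raw")
    directory = file_system.create_directory("sales/2020")
    file_client = directory.create_file("orders.csv")

    with open("orders.csv", "rb") as local_file:
        file_client.upload_data(local_file, overwrite=True)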

Module 3: Enabling Team Based Data Science with Azure Databricks


Explain Azure Databricks
Work with Azure Databricks
Read data with Azure Databricks
Perform transformations with Azure Databricks
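
For example, a read-and-transform step like those covered in this module might look as follows in a Databricks notebook cell using PySpark. This is a minimal sketch: the file path and column names are assumptions, and the spark session is provided automatically inside a Databricks notebook.

    # Runs inside an Azure Databricks notebook, where `spark` is already defined.
    # The path and column names below are illustrative placeholders.
    from pyspark.sql import functions as F

    # Read a CSV file from a data lake location mounted into DBFS.
    orders = (
        spark.read
        .option("header", "true")
        .option("inferSchema", "true")
        .csv("/mnt/datalake/raw/sales/2020/orders.csv")
    )

    # Transformation: total order value per product category.
    totals = (
        orders
        .withColumn("line_total", F.col("quantity") * F.col("unit_price"))
        .groupBy("category")
        .agg(F.sum("line_total").alias("total_sales"))
        .orderBy(F.col("total_sales").desc())
    )

    totals.show()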

Module 4: Building Globally Distributed Databases with Cosmos DB


Create an Azure Cosmos DB database built to scale
Insert and query data in your Azure Cosmos DB database
Build a .NET Core app for Cosmos DB in Visual Studio Code
Distribute your data globally with Azure Cosmos DB
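
The course lab builds the client in .NET Core, but the same insert-and-query pattern can be sketched in Python with the azure-cosmos SDK; the endpoint, key, database, container, and partition key below are placeholder assumptions.

    from azure.cosmos import CosmosClient, PartitionKey

    # Placeholder endpoint and key -- substitute your own Cosmos DB account values.
    ENDPOINT = "https://myaccount.documents.azure.com:443/"
    KEY = "<cosmos-account-key>"

    client = CosmosClient(ENDPOINT, credential=KEY)

    # Create the database and container if missing; /category is the partition key.
    database = client.create_database_if_not_exists("RetailDB")
    container = database.create_container_if_not_exists(
        id="Products",
        partition_key=PartitionKey(path="/category"),
    )

    # Insert (upsert) a document.
    container.upsert_item({
        "id": "1",
        "category": "books",
        "name": "Azure Data Fundamentals",
        "price": 29.99,
    })

    # Query documents within a single partition.
    for item in container.query_items(
        query="SELECT c.name, c.price FROM c WHERE c.category = @category",
        parameters=[{"name": "@category", "value": "books"}],
        partition_key="books",
    ):
        print(item["name"], item["price"])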

Module 5: Working with Relational Data Stores in the Cloud


Use Azure SQL Database
Describe Azure SQL Data Warehouse
Creating and Querying an Azure SQL Data Warehouse
Use PolyBase to Load Data into Azure SQL Data Warehouse
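
As a small illustration (not a course lab), the sketch below connects to an Azure SQL Database or SQL Data Warehouse endpoint with pyodbc and runs an aggregate query; the server, credentials, and table are placeholder assumptions, and PolyBase loading itself is performed in T-SQL.

    import pyodbc

    # Placeholder connection details -- substitute your own server and credentials.
    CONN_STR = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=myserver.database.windows.net;"
        "DATABASE=SalesDW;"
        "UID=sqladmin;"
        "PWD=<password>;"
    )

    # Connect and run a simple aggregate query against an assumed fact table.
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT TOP 10 ProductKey, SUM(SalesAmount) AS TotalSales "
            "FROM dbo.FactSales GROUP BY ProductKey ORDER BY TotalSales DESC"
        )
        for product_key, total_sales in cursor.fetchall():
            print(product_key, total_sales)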

Module 6: Performing Real-Time Analytics with Stream Analytics


Explain data streams and event processing
Data Ingestion with Event Hubs
Processing Data with Stream Analytics Jobs
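
To make the ingestion step concrete, here is a hedged Python sketch that sends a small batch of JSON events to an Event Hub with the azure-eventhub SDK; a Stream Analytics job would then read from that hub. The connection string, hub name, and event shape are placeholder assumptions.

    import json
    from azure.eventhub import EventHubProducerClient, EventData

    # Placeholder values -- substitute your own Event Hubs namespace details.
    CONNECTION_STR = "<event-hubs-namespace-connection-string>"
    EVENTHUB_NAME = "telemetry"

    producer = EventHubProducerClient.from_connection_string(
        CONNECTION_STR, eventhub_name=EVENTHUB_NAME
    )

    # Build one batch of JSON events and send it to the hub.
    readings = [
        {"deviceId": "sensor-01", "temperature": 21.5},
        {"deviceId": "sensor-02", "temperature": 24.1},
    ]

    with producer:
        batch = producer.create_batch()
        for reading in readings:
            batch.add(EventData(json.dumps(reading)))
        producer.send_batch(batch)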

Module 7: Orchestrating Data Movement with Azure Data Factory


Explain how Azure Data Factory works
Azure Data Factory Components
Azure Data Factory and Databricks
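
A pipeline authored in Data Factory can also be started from code. The sketch below assumes the azure-mgmt-datafactory and azure-identity packages; the subscription, resource group, factory, and pipeline names are placeholders rather than course values.

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    # Placeholder identifiers -- substitute your own subscription and factory details.
    SUBSCRIPTION_ID = "<subscription-id>"
    RESOURCE_GROUP = "data-rg"
    FACTORY_NAME = "sales-adf"
    PIPELINE_NAME = "CopySalesToDataLake"

    # Authenticate with whatever credential is available (CLI login, managed identity, ...).
    adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    # Start a pipeline run and read back its status.
    run = adf.pipelines.create_run(
        RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME, parameters={}
    )
    status = adf.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id)
    print(f"Run {run.run_id} is {status.status}")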

Module 8: Securing Azure Data Platforms


An introduction to security
Key security components
Securing Storage Accounts and Data Lake Storage
Securing Data Stores
Securing Streaming Data
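
Shared access signatures are one of the key security components covered here. The sketch below generates a short-lived, read-only SAS URL for a single blob with the azure-storage-blob SDK; the account, container, and blob names are placeholder assumptions.

    from datetime import datetime, timedelta
    from azure.storage.blob import generate_blob_sas, BlobSasPermissions

    # Placeholder account details -- substitute your own storage account values.
    ACCOUNT_NAME = "mydatalakeaccount"
    ACCOUNT_KEY = "<storage-account-key>"
    CONTAINER = "raw"
    BLOB = "sales/2020/orders.csv"

    # Create a read-only SAS token that expires in one hour.
    sas_token = generate_blob_sas(
        account_name=ACCOUNT_NAME,
        container_name=CONTAINER,
        blob_name=BLOB,
        account_key=ACCOUNT_KEY,
        permission=BlobSasPermissions(read=True),
        expiry=datetime.utcnow() + timedelta(hours=1),
    )

    # Clients can use this URL to read the blob until the token expires.
    sas_url = f"https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER}/{BLOB}?{sas_token}"
    print(sas_url)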

Module 9: Monitoring and Troubleshooting Data Storage and Processing


Explain the monitoring capabilities that are available
Troubleshoot common data storage issues
Troubleshoot common data processing issues
Manage disaster recovery
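
As one illustration of the monitoring capabilities, the sketch below pulls an hour of transaction metrics for a storage account, assuming the azure-monitor-query package and placeholder resource identifiers.

    from datetime import timedelta
    from azure.identity import DefaultAzureCredential
    from azure.monitor.query import MetricsQueryClient

    # Placeholder resource ID -- substitute the full ID of your own storage account.
    RESOURCE_ID = (
        "/subscriptions/<subscription-id>/resourceGroups/data-rg"
        "/providers/Microsoft.Storage/storageAccounts/mydatalakeaccount"
    )

    client = MetricsQueryClient(DefaultAzureCredential())

    # Query the Transactions metric for the last hour.
    response = client.query_resource(
        RESOURCE_ID,
        metric_names=["Transactions"],
        timespan=timedelta(hours=1),
    )

    # Print each data point in the returned time series.
    for metric in response.metrics:
        for series in metric.timeseries:
            for point in series.data:
                print(metric.name, point.timestamp, point.total)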