Mar 2016 updated: Pass4sure Microsoft 70-534 free braindumps 17-32

Exam Code: 70-534 (Practice Exam Latest Test Questions VCE PDF)
Exam Name: Architecting Microsoft Azure Solutions
Certification Provider: Microsoft
Free Today! Guaranteed Training- Pass 70-534 Exam.

2016 Mar 70-534 Study Guide Questions:

Q17. - (Topic 6) 

A company has a very large dataset that includes sensitive information. The dataset is over 30 TB in size. 

You have a standard business-class ISP internet connection that is rated at 100 megabits/second. 

You have 10 4-TB hard drives that are approved to work with the Azure Import/Export Service. 

You need to migrate the dataset to Azure. The solution must meet the following requirements: 

The dataset must be transmitted securely to Azure. 

Network bandwidth must not increase. 

Hardware costs must be minimized. 

What should you do? 

A. Prepare the drives with the Azure Import/Export tool and then create the import job. Ship the drives to Microsoft via a supported carrier service. 

B. Create an export job and then encrypt the data on the drives by using the Advanced Encryption Standard (AES). Create a destination Blob to store the export data. 

C. Create an import job and then encrypt the data on the drives by using the Advanced Encryption Standard (AES). Create a destination Blob to store the import data. 

D. Prepare the drives by using Sysprep.exe and then create the import job. Ship the drives to Microsoft via a supported carrier service. 

Answer: A 

Explanation: You can use the Microsoft Azure Import/Export service to transfer large amounts of file data to Azure Blob storage in situations where uploading over the network is prohibitively expensive or not feasible. 

Reference: Use the Microsoft Azure Import/Export Service to Transfer Data to Blob Storage 

http://azure.microsoft.com/en-gb/documentation/articles/storage-import-export-service/ 
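The tradeoff behind answer A can be sanity-checked with quick arithmetic (a back-of-the-envelope sketch only; it ignores protocol overhead and drive formatting loss):

```python
# Rough estimate: how long would uploading the dataset take over the ISP link,
# and how many of the approved drives would the Import/Export job need?

DATASET_TB = 30
LINK_MBPS = 100          # megabits per second
DRIVE_TB = 4

dataset_bits = DATASET_TB * 10**12 * 8          # 1 TB = 10^12 bytes
seconds = dataset_bits / (LINK_MBPS * 10**6)
days = seconds / 86400

drives_needed = -(-DATASET_TB // DRIVE_TB)      # ceiling division

print(f"Upload time at full line rate: ~{days:.1f} days")
print(f"Drives required: {drives_needed} of the 10 available")
```

At full line rate the upload alone takes roughly four weeks, which is why shipping 8 of the 10 drives is the practical option.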


Q18. - (Topic 6) 

You are designing a solution that will interact with non-Windows applications over unreliable network connections. You have a security token for each non-Windows application. 

You need to ensure that non-Windows applications retrieve messages from the solution. 

Where should you retrieve messages? 

A. An Azure Queue 

B. The Azure Service Bus Queue 

C. An Azure blob storage container that has a private access policy 

D. Azure Table storage 

Answer: B 

Explanation: Any Microsoft or non-Microsoft application can use the Service Bus REST API to manage and access messaging entities over HTTPS. 

By using REST, applications based on non-Microsoft technologies (e.g. Java, Ruby) can not only send and receive messages from the Service Bus, but also create or delete queues, topics, and subscriptions in a given namespace. 

Reference: Service Bus Explorer 

https://code.msdn.microsoft.com/windowsazure/service-bus-explorer-f2abca5a 
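The "security token" each non-Windows application holds is typically a Service Bus Shared Access Signature (SAS), which any platform can compute and attach to its HTTPS requests. A minimal sketch of how such a token is built follows; the namespace, queue name, and key values are made-up placeholders:

```python
# Sketch: building a Service Bus SAS token for REST calls over HTTPS.
# The signature is HMAC-SHA256 over "<url-encoded-uri>\n<expiry>".
import base64
import hashlib
import hmac
import urllib.parse

def make_sas_token(resource_uri: str, key_name: str, key: str, expiry_epoch: int) -> str:
    encoded_uri = urllib.parse.quote_plus(resource_uri)
    string_to_sign = f"{encoded_uri}\n{expiry_epoch}"
    signature = base64.b64encode(
        hmac.new(key.encode("utf-8"), string_to_sign.encode("utf-8"),
                 hashlib.sha256).digest()
    ).decode()
    return ("SharedAccessSignature "
            f"sr={encoded_uri}&sig={urllib.parse.quote_plus(signature)}"
            f"&se={expiry_epoch}&skn={key_name}")

# Hypothetical namespace, queue, and (fake) key for illustration.
token = make_sas_token(
    "https://contoso-ns.servicebus.windows.net/myqueue",
    "RootManageSharedAccessKey",
    "not-a-real-key",
    1700000000,
)
print(token)
```

The token goes in the `Authorization` header of each REST request, which is how a Java or Ruby client retrieves messages without any Windows-specific library.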


Q19. - (Topic 6) 

You design an Azure web application. The web application is accessible by default as a standard cloudapp.net URL. 

You need to recommend a DNS resource record type that will allow you to configure access to the web application by using a custom domain name. 

Which DNS record type should you recommend? 

A. SRV 

B. MX 

C. CNAME 

D. A 

Answer: C 

Explanation: A CNAME record maps a specific domain, such as contoso.com or www.contoso.com, to a canonical domain name. In this case, the canonical domain name is the <myapp>.cloudapp.net domain name of your Azure hosted application. Once created, the CNAME creates an alias for the <myapp>.cloudapp.net. The CNAME entry will resolve to the IP address of your <myapp>.cloudapp.net service automatically, so if the IP address of the cloud service changes, you do not have to take any action. 

Incorrect: Not D: 

Since an A record is mapped to a static IP address, it cannot automatically resolve changes to the IP address of your Cloud Service. 

An A record maps a domain, such as contoso.com or www.contoso.com, or a wildcard domain such as *.contoso.com, to an IP address; in the case of an Azure cloud service, the virtual IP of the service. The main benefit of an A record over a CNAME record is that you can have one entry that uses a wildcard, such as *.contoso.com, which would handle requests for multiple sub-domains such as mail.contoso.com, login.contoso.com, or www.contoso.com. 

Reference: Configuring a custom domain name for an Azure cloud service 

http://azure.microsoft.com/en-gb/documentation/articles/cloud-services-custom-domain-name/ 
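In BIND-style zone-file terms the two record types compare as follows (hostnames and the IP are illustrative placeholders):

```
; CNAME: aliases www.contoso.com to the cloud service's canonical name.
; The IP is resolved through myapp.cloudapp.net, so it tracks IP changes.
www.contoso.com.    IN  CNAME  myapp.cloudapp.net.

; A: pins the name to a fixed virtual IP; must be updated if the VIP changes,
; but supports wildcards such as *.contoso.com.
contoso.com.        IN  A      203.0.113.10
```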


Q20. - (Topic 6) 

You are designing a distributed application for Azure. 

The application must securely integrate with on-premises servers. 

You need to recommend a method of enabling Internet Protocol security (IPsec)-protected 

connections between on-premises servers and the distributed application. 

What should you recommend? 

A. Azure Access Control 

B. Azure Content Delivery Network (CDN) 

C. Azure Service Bus 

D. Azure Site-to-Site VPN 

Answer: D 

Explanation: IPsec can be used on Azure Site-to-Site VPN connections. Distributed applications can use the IPsec VPN connections to communicate. 

Reference: About Virtual Network Secure Cross-Premises Connectivity 

https://msdn.microsoft.com/en-us/library/azure/dn133798.aspx 


Q21. - (Topic 5) 

You need to recommend a technology for processing customer pickup requests. 

Which technology should you recommend? 

A. Notification hub 

B. Queue messaging 

C. Mobile Service with push notifications 

D. Service Bus messaging 

Answer: D 

Explanation: Service Bus queues are part of a broader Azure messaging infrastructure that supports queuing as well as publish/subscribe, Web service remoting, and integration patterns. 

Service Bus queues support a push-style API (while Azure Queue messaging does not). 

Incorrect: 

Not A: Notification Hub is only used to push notification, not for processing requests. 

Not B: As a solution architect/developer, you should consider using Azure Queues when: 

Your application must store over 80 GB of messages in a queue, where the messages have a lifetime shorter than 7 days. 

Your application wants to track progress for processing a message inside of the queue. This is useful if the worker processing a message crashes. A subsequent worker can then use that information to continue from where the prior worker left off. 

You require server side logs of all of the transactions executed against your queues. 

Not C: To process the messages we do not need push notification. 

Reference: Azure Queues and Service Bus Queues - Compared and Contrasted 

https://msdn.microsoft.com/en-us/library/azure/hh767287.aspx 
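The criteria quoted above can be condensed into a small rule-of-thumb helper (the thresholds come from the comparison article cited; the function itself is purely illustrative, not an official API):

```python
# Rule-of-thumb chooser between Azure Storage queues and Service Bus queues,
# based on the criteria listed in the explanation above.

def recommend_queue(needs_push_delivery: bool,
                    message_lifetime_days: float,
                    total_queue_size_gb: float,
                    needs_pub_sub: bool) -> str:
    # Storage queues win for very large queues with short-lived messages.
    if total_queue_size_gb > 80 and message_lifetime_days < 7:
        return "Azure Storage queue"
    # Push-style delivery and publish/subscribe are Service Bus features.
    if needs_push_delivery or needs_pub_sub:
        return "Service Bus queue"
    return "either (compare remaining criteria)"

# The pickup-request scenario: push-style processing, modest volume.
print(recommend_queue(needs_push_delivery=True,
                      message_lifetime_days=1,
                      total_queue_size_gb=1,
                      needs_pub_sub=False))
```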


Topic 6, Mix Questions

31. - (Topic 6) 

Contoso, Ltd., uses Azure websites for public-facing customer websites. The company has a mobile app that requires that customers sign in by using a Contoso customer account. 

Customers must be able to sign on to the websites and mobile app by using a Microsoft, Facebook, or Google account. All transactions must be secured in-transit regardless of device. 

You need to configure the websites and mobile app to work with external identity providers. 

Which three actions should you perform? Each correct answer presents part of the solution. 

A. Request a certificate from a domain registrar for the website URL, and enable TLS/SSL. 

B. Configure IPsec for the websites and the mobile app. 

C. Configure the KerberosTokenProfile 1.1 protocol. 

D. Configure OAuth2 to connect to an external authentication provider. 

E. Build an app by using MVC 5 that is hosted in Azure to provide a framework for the underlying authentication. 

Answer: A,D,E 

Explanation: D, E: This tutorial shows you how to build an ASP.NET MVC 5 web application that enables users to log in using OAuth 2.0 with credentials from an external authentication provider, such as Facebook, Twitter, LinkedIn, Microsoft, or Google. 

A: 

You will now be redirected back to the Register page of the MvcAuth application where you can register your Google account. You have the option of changing the local email registration name used for your Gmail account, but you generally want to keep the default email alias (that is, the one you used for authentication). Click Register. 

To connect to authentication providers like Google and Facebook, you will need to set up IIS-Express to use SSL. 

Reference: Code! MVC 5 App with Facebook, Twitter, LinkedIn and Google OAuth2 Sign-on (C#) 

http://www.asp.net/mvc/overview/security/create-an-aspnet-mvc-5-app-with-facebook-and-google-oauth2-and-openid-sign-on 


70-534 actual exam

Up-to-date 70-534 PDF exam questions:

Q22. DRAG DROP - (Topic 3) 

You need to deploy the virtual machines to Azure. 

Which four Azure PowerShell scripts should you run in sequence? To answer, move the appropriate scripts from the list of scripts to the answer area and arrange them in the correct order. 


Answer: 



Q23. - (Topic 6) 

You have business services that run on an on-premises mainframe server. 

You must provide an intermediary configuration to support existing business services and Azure. The business services cannot be rewritten. The business services are not exposed externally. 

You need to recommend an approach for accessing the business services. 

What should you recommend? 

A. Connect to the on-premises server by using a custom service in Azure. 

B. Expose the business services to the Azure Service Bus by using a custom service that uses relay binding. 

C. Expose the business services externally. 

D. Move all business service functionality to Azure. 

Answer: B 

Explanation: The Service Bus relay service enables you to build hybrid applications that run in both an Azure datacenter and your own on-premises enterprise environment. The Service Bus relay facilitates this by enabling you to securely expose Windows Communication Foundation (WCF) services that reside within a corporate enterprise network to the public cloud, without having to open a firewall connection, or require intrusive changes to a corporate network infrastructure. 

Reference: How to Use the Service Bus Relay Service 

http://azure.microsoft.com/en-gb/documentation/articles/service-bus-dotnet-how-to-use-relay/ 


Q24. - (Topic 6) 

An application currently resides on an on-premises virtual machine that has 2 CPU cores, 4 GB of RAM, 20 GB of hard disk space, and a 10 megabit/second network connection. 

You plan to migrate the application to Azure. You have the following requirements: 

You must not make changes to the application. 

You must minimize the costs for hosting the application. 

You need to recommend the appropriate virtual machine instance type. 

Which virtual machine tier should you recommend? 

A. Network Optimized (A Series) 

B. General Purpose Compute, Basic Tier (A Series) 

C. General Purpose Compute, Standard Tier (A Series) 

D. Optimized Compute (D Series) 

Answer: B 

Explanation: General purpose compute, Basic tier: An economical option for development workloads, test servers, and other applications that don't require load balancing, auto-scaling, or memory-intensive virtual machines. 

CPU core range: 1-8 
RAM range: 0.75-14 GB 
Disk size: 20-240 GB 

Reference: Virtual Machines Pricing. Launch Windows Server and Linux in minutes 

http://azure.microsoft.com/en-us/pricing/details/virtual-machines/ 
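A quick check confirms that the on-premises VM's footprint fits inside the Basic tier ranges quoted above (illustrative code only; the ranges are from the explanation, not queried from Azure):

```python
# Does the application's VM footprint fit the Basic tier ranges
# (cores 1-8, RAM 0.75-14 GB, disk 20-240 GB)?

basic_tier = {"cores": (1, 8), "ram_gb": (0.75, 14), "disk_gb": (20, 240)}
app_vm = {"cores": 2, "ram_gb": 4, "disk_gb": 20}

fits = all(lo <= app_vm[k] <= hi for k, (lo, hi) in basic_tier.items())
print("Basic tier is sufficient" if fits else "Needs a larger tier")
```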


Q25. - (Topic 1) 

You need to design the system that alerts project managers to data changes in the contractor information app. 

Which service should you use? 

A. Azure Mobile Service 

B. Azure Service Bus Message Queueing 

C. Azure Queue Messaging 

D. Azure Notification Hub 

Answer: C 

Explanation: * Scenario: 
/ Mobile Apps: Event-triggered alerts must be pushed to mobile apps by using a custom Node.js script. 
/ The service level agreement (SLA) for the solution requires an uptime of 99.9%. 

* If you are already using Azure Storage Blobs or Tables and you start using queues, you are guaranteed 99.9% availability. If you use Blobs or Tables with Service Bus queues, you will have lower availability. 

Note: Microsoft Azure supports two types of queue mechanisms: Azure Queues and Service Bus Queues. 
/ Azure Queues, which are part of the Azure storage infrastructure, feature a simple REST-based Get/Put/Peek interface, providing reliable, persistent messaging within and between services. 
/ Service Bus queues are part of a broader Azure messaging infrastructure that supports queuing as well as publish/subscribe, Web service remoting, and integration patterns. 

Reference: Azure Queues and Service Bus Queues - Compared and Contrasted 

https://msdn.microsoft.com/en-us/library/azure/hh767287.aspx 


Q26. DRAG DROP - (Topic 6) 

You are the Azure architect for an organization. You are working with C-level management to assign Azure role-based access control roles to a team within the organization. A single director oversees two teams, a development team and a test team. The director is wholly responsible for the organization's Azure account, including billing, infrastructure, and access control. The director is the only member of the team with the ability to alter access controls. 

You have the following requirements: 

. Members of the development team must be able to view or alter Azure infrastructure to support application development. 
. Members of the test team must be able to view Azure infrastructure to support test cases. 

You need to assign built-in Azure role-based access control roles to team members within the organization. 

Which role should you assign to each team member? To answer, drag the appropriate role to the correct team member. Each role may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. 


Answer: 
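The answer graphic is not reproduced in this copy. Based on standard semantics of the built-in roles (Owner: full access including delegating access; Contributor: create and manage resources but cannot grant access; Reader: view only), the likely mapping is sketched below:

```python
# Likely mapping of built-in Azure RBAC roles to the teams described above.
# The original answer image is missing; this reflects standard role semantics.

role_assignments = {
    "Director": "Owner",                # billing, infrastructure, access control
    "Development team": "Contributor",  # view or alter infrastructure
    "Test team": "Reader",              # view infrastructure for test cases
}

for member, role in role_assignments.items():
    print(f"{member}: {role}")
```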



70-534 study guide

Verified 70-534 answers:

Q27. - (Topic 6) 

You are designing an Azure web application that includes many static content files. 

The application is accessed from locations all over the world by using a custom domain name. 

You need to recommend an approach for providing access to the static content with the least amount of latency. 

Which two actions should you recommend? Each correct answer presents part of the solution. 

A. Place the static content in Azure Table storage. 

B. Configure a CNAME DNS record for the Azure Content Delivery Network (CDN) domain. 

C. Place the static content in Azure Blob storage. 

D. Configure a custom domain name that is an alias for the Azure Storage domain. 

Answer: B,C 

Explanation: B: There are two ways to map your custom domain to a CDN endpoint. 

1. Create a CNAME record with your domain registrar and map your custom domain and subdomain to the CDN endpoint. 

2. Add an intermediate registration step with Azure cdnverify. 

C: The Azure Content Delivery Network (CDN) offers developers a global solution for delivering high-bandwidth content by caching blobs and static content of compute instances at physical nodes in the United States, Europe, Asia, Australia and South America. The benefits of using CDN to cache Azure data include: 
/ Better performance and user experience for end users who are far from a content source, and are using applications where many 'internet trips' are required to load content 
/ Large distributed scale to better handle instantaneous high load, say, at the start of an event such as a product launch 

Reference: Using CDN for Azure 
https://azure.microsoft.com/en-gb/documentation/articles/cdn-how-to-use/ 

Reference: How to map Custom Domain to Content Delivery Network (CDN) endpoint 

https://github.com/Azure/azure-content/blob/master/articles/cdn-map-content-to-custom-domain.md 


Q28. DRAG DROP - (Topic 2) 

You need to recommend a test strategy for the disaster recovery system. 

What should you do? To answer, drag the appropriate test strategy to the correct business application. Each test strategy may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. 


Answer: 



Topic 3, Contoso, Ltd

Background

Overview

Contoso, Ltd., manufactures and sells golf clubs and golf balls. Contoso also sells golf accessories under the Contoso Golf and Odyssey brands worldwide.

Most of the company's IT infrastructure is located in the company's Carlsbad, California, headquarters. Contoso also has a sizable third-party colocation datacenter that costs the company USD $30,000 to $40,000 a month. Contoso has other servers scattered around the United States.

Contoso, Ltd., has the following goals:

Move many consumer-facing websites, enterprise databases, and enterprise web services to Azure.

Improve the performance for customers and resellers who access company websites from around the world.

Provide support for provisioning resources to meet bursts of demand.

Consolidate and improve the utilization of website- and database-hosting resources.

Avoid downtime, particularly that caused by web and database server updating.

Leverage familiarity with Microsoft server management tools.

Infrastructure

Contoso's datacenters are filled with dozens of smaller web servers and databases that run on under-utilized hardware. This creates issues for data backup. Contoso currently backs up data to tape by using System Center Data Protection Manager. System Center Operations Manager is not deployed in the enterprise.

All of the servers are expensive to acquire and maintain, and scaling the infrastructure takes significant time. Contoso conducts weekly server maintenance, which causes downtime for some of its global offices. Special events, such as high-profile golf tournaments, create a large increase in site traffic. Contoso has difficulty scaling the web-hosting environment fast enough to meet these surges in site traffic.

Contoso has resellers and consumers in Japan and China. These resellers must use applications that run in a datacenter that is located in the state of Texas, in the United States. Because of the physical distance, the resellers experience slow response times and downtime.

Business Requirements

Management and Performance

Management

Web servers and databases must automatically apply updates to the operating system and products.

Automatically monitor the health of worldwide sites, databases, and virtual machines.

Automatically back up the website and databases.

Manage hosted resources by using on-premises tools.

Performance

The management team would like to centralize data backups and eliminate the use of tapes.

The website must automatically scale without code changes or redeployment.

Support changes in service tier without reconfiguration or redeployment.

Site-hosting must automatically scale to accommodate data bandwidth and number of connections.

Scale databases without requiring migration to a larger server.

Migrate business critical applications to Azure.

Migrate databases to the cloud and centralize databases where possible.

Business Continuity and Support

Business Continuity

Minimize downtime in the event of regional disasters.

Recover data if unintentional modifications or deletions are discovered.

Run the website on multiple web server instances to minimize downtime and support a high service level agreement (SLA).

Connectivity

Allow enterprise web services to access data and other services located on-premises.

Provide and monitor lowest latency possible to website visitors.

Automatically balance traffic among all web servers.

Provide secure transactions for users of both legacy and modern browsers.

Provide automated auditing and reporting of web servers and databases.

Support single sign-on from multiple domains.

Development Environment

You identify the following requirements for the development environment:

Support the current development team's knowledge of Microsoft web development and SQL Server tools.

Support building experimental applications by using data from the Azure deployment and on-premises data sources.

Mitigate the need to purchase additional tools for monitoring and debugging.

System designers and architects must be able to create custom Web APIs without requiring any coding.

Support automatic website deployment from source control.

Support automated build verification and testing to mitigate bugs introduced during builds.

Manage website versions across all deployments.

Ensure that website versions are consistent across all deployments.

Technical Requirement

Management and Performance

Management

Use build automation to deploy directly from Visual Studio.

Use build-time versioning of assets and builds/releases.

Automate common IT tasks such as VM creation by using Windows PowerShell workflows.

Use advanced monitoring features and reports of workloads in Azure by using existing Microsoft tools.

Performance

Websites must automatically load balance across multiple servers to adapt to varying traffic.

In production, websites must run on multiple instances.

First-time published websites must be published by using Visual Studio and scaled to a single instance to test publishing.

Data storage must support automatic load balancing across multiple servers.

Websites must adapt to wide increases in traffic during special events.

Azure virtual machines (VMs) must be created in the same datacenter when applicable.

Business Continuity and Support

Business Continuity

Automatically co-locate data and applications in different geographic locations.

Provide real-time reporting of changes to critical data and binaries.

Provide real-time alerts of security exceptions.

Unwanted deletions or modifications of data must be reversible for up to one month, especially in business critical applications and databases.

Any cloud-hosted servers must be highly available.

Enterprise Support

The solution must use stored procedures to access on-premises SQL Server data from Azure.

A debugger must automatically attach to websites on a weekly basis. The scripts that handle the configuration and setup of debugging cannot work if there is a delay in attaching the debugger.

14. HOTSPOT - (Topic 3) 

You need to implement tools at the client's location for monitoring and deploying Azure resources. 

Which tools should you use? To answer, select the appropriate on-premises tool for each task in the answer area. 


Answer: 



Q29. - (Topic 6) 

You are evaluating an Azure application. The application includes the following elements: 

. A web role that provides the ASP.NET user interface and business logic 

. A single SQL database that contains all application data 

Each webpage must receive data from the business logic layer before returning results to the client. Traffic has increased significantly. The business logic is causing high CPU usage. 

You need to recommend an approach for scaling the application. 

What should you recommend? 

A. Store the business logic results in Azure Table storage. 

B. Vertically partition the SQL database. 

C. Move the business logic to a worker role. 

D. Store the business logic results in Azure local storage. 

Answer: C 

Explanation: Cloud Services applications in Azure need both web and worker roles to scale well. 

Reference: Application Patterns and Development Strategies for SQL Server in Azure Virtual Machines 

https://msdn.microsoft.com/en-us/library/azure/dn574746.aspx 
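The web-role/worker-role split that option C describes can be simulated in-process: the "web role" enqueues work and returns, while a "worker role" drains the queue and runs the CPU-heavy logic. This is a toy sketch; real roles would communicate through an Azure queue between instances, not an in-memory one:

```python
# Toy simulation of option C: the web role enqueues work items and a worker
# role instance drains the queue, keeping heavy business logic off the web tier.
import queue
import threading

work_queue = queue.Queue()
results = {}

def worker_role():
    while True:
        item = work_queue.get()
        if item is None:              # sentinel: shut down the worker
            break
        results[item] = item * item   # stand-in for heavy business logic

worker = threading.Thread(target=worker_role)
worker.start()

for request in range(5):              # the "web role" enqueues page requests
    work_queue.put(request)
work_queue.put(None)
worker.join()
print(results)
```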


Q30. - (Topic 1) 

You need to recommend a solution that allows partners to authenticate. 

Which solution should you recommend? 

A. Configure the federation provider to trust social identity providers. 

B. Configure the federation provider to use the Azure Access Control service. 

C. Create a new directory in Azure Active Directory and create a user account for the partner. 

D. Create an account on the VanArsdel domain for the partner and send an email message that contains the password to the partner. 

Answer: B 

Explanation: * Scenario: The partners all use Hotmail.com email addresses. 

* In Microsoft Azure Active Directory Access Control (also known as Access Control Service or ACS), an identity provider is a service that authenticates user or client identities and issues security tokens that ACS consumes. The ACS Management Portal provides built-in support for configuring Windows Live ID as an ACS Identity Provider. 

Incorrect: 

Not C, not D: Scenario: VanArsdel management does NOT want to create and manage 

user accounts for partners. 

Reference: Identity Providers 

https://msdn.microsoft.com/en-us/library/azure/gg185971.aspx 


Topic 2, Trey Research

Background

Overview

Trey Research conducts agricultural research and sells the results to the agriculture and food industries. The company uses a combination of on-premises and third-party server clusters to meet its storage needs. Trey Research has seasonal demands on its services, with up to 50 percent drops in data capacity and bandwidth demand during low-demand periods. They plan to host their websites in an agile, cloud environment where the company can deploy and remove its websites based on its business requirements rather than the requirements of the hosting company.

A recent fire near the datacenter that Trey Research uses raises the management team's awareness of the vulnerability of hosting all of the company's websites and data at any single location. The management team is concerned about protecting its data from loss as a result of a disaster.

Websites

Trey Research has a portfolio of 300 websites and associated background processes that are currently hosted in a third-party datacenter. All of the websites are written in ASP.NET, and the background processes use Windows Services. The hosting environment costs Trey Research approximately $25 million in hosting and maintenance fees.

Infrastructure

Trey Research also has on-premises servers that run VMs to support line-of-business applications. The company wants to migrate the line-of-business applications to the cloud, one application at a time. The company is migrating most of its production VMs from an aging VMware ESXi farm to a Hyper-V cluster that runs on Windows Server 2012.

Applications

DistributionTracking

Trey Research has a web application named DistributionTracking. This application constantly collects real-time data that tracks worldwide distribution points to customer retail sites. This data is available to customers at all times.

The company wants to ensure that the distribution tracking data is stored at a location that is geographically close to the customers who will be using the information. The system must continue running in the event of VM failures without corrupting data. The system is processor intensive and should be run in a multithreading environment.

HRApp

The company has a human resources (HR) application named HRApp that stores data in an on-premises SQL Server database. The database must have at least two copies, but data to support backups and business continuity must stay in Trey Research locations only.

The data must remain on-premises and cannot be stored in the cloud.

HRApp was written by a third party, and the code cannot be modified. The human resources data is used by all business offices, and each office requires access to the entire database. Users report that HRApp takes all night to generate the required payroll reports, and they would like to reduce this time.

MetricsTracking

Trey Research has an application named MetricsTracking that is used to track analytics for the DistributionTracking web application. The data MetricsTracking collects is not customer-facing. Data is stored on an on-premises SQL Server database, but this data should be moved to the cloud. Employees at other locations access this data by using a remote desktop connection to connect to the application, but latency issues degrade the functionality.

Trey Research wants a solution that allows remote employees to access metrics data without using a remote desktop connection. MetricsTracking was written in-house, and the development team is available to make modifications to the application if necessary.

However, the company wants to continue to use SQL Server for MetricsTracking.

Business Requirements

Business Continuity

You have the following requirements:

Move all customer-facing data to the cloud.

Web servers should be backed up to geographically separate locations.

If one website becomes unavailable, customers should automatically be routed to websites that are still operational.

Data must be available regardless of the operational status of any particular website.

The HRApp system must remain on-premises and must be backed up.

The MetricsTracking data must be replicated so that it is locally available to all Trey Research offices.

Auditing and Security

You have the following requirements:

Both internal and external consumers should be able to access research results.

Internal users should be able to access data by using their existing company credentials without requiring multiple logins.

Consumers should be able to access the service by using their Microsoft credentials.

Applications written to access the data must be authenticated.

Access and activity must be monitored and audited.

Ensure the security and integrity of the data collected from the worldwide distribution points for the distribution tracking application.

Storage and Processing

You have the following requirements:

Provide real-time analysis of distribution tracking data by geographic location.

Collect and store large datasets in real time for customer use.

Locate the distribution tracking data as close to the central office as possible to improve bandwidth.

Co-locate the distribution tracking data as close to the customer as possible based on the customer's location.

Distribution tracking data must be stored in the JSON format and indexed by metadata that is stored in a SQL Server database.

Data in the cloud must be stored in geographically separate locations, but kept within the same political boundaries.

Technical Requirements

Migration

You have the following requirements:

Deploy all websites to Azure.

Replace on-premises and third-party physical server clusters with cloud-based solutions.

Optimize the speed for retrieving existing JSON objects that contain the distribution tracking data.

Recommend strategies for partitioning data for load balancing.

Auditing and Security

You have the following requirements:

Use Active Directory for internal and external authentication.

Use OAuth for application authentication.

Business Continuity

You have the following requirements:

Data must be backed up to separate geographic locations.

Web servers must run concurrent versions of all websites in distinct geographic locations.

Use Azure to back up the on-premises MetricsTracking data.

Use Azure virtual machines as a recovery platform for MetricsTracking and HRApp.

Ensure that there is at least one additional on-premises recovery environment for the HRApp.

9. DRAG DROP - (Topic 2) 

You need to ensure that customer data is secured both in transit and at rest. 

Which technologies should you recommend? To answer, drag the appropriate technology to the correct security requirement. Each technology may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. 


Answer: 



Q31. - (Topic 6) 

A company has 10 on-premises SQL databases. The company plans to move the databases to SQL Server 2012 that runs in Azure Infrastructure-as-a-Service (IaaS). After migration, the databases will support a limited number of Azure websites in the same Azure Virtual Network. 

You have the following requirements: 

. You must restore copies of existing on-premises SQL databases to the SQL servers that run in Azure IaaS. 

. You must be able to manage the SQL databases remotely. 

. You must not open a direct connection from all of the machines on the on-premises network to Azure. 

. Connections to the databases must originate from only five Windows computers. 

You need to configure remote connectivity to the databases. 

Which technology solution should you implement? 

A. Azure Virtual Network site-to-site VPN 

B. Azure Virtual Network multi-point VPN 

C. Azure Virtual Network point-to-site VPN 

D. Azure ExpressRoute 

Answer: C 

Explanation: A point-to-site VPN would meet the requirements. 

Reference: Configure a Point-to-Site VPN connection to an Azure Virtual Network 

https://azure.microsoft.com/en-us/documentation/articles/vpn-gateway-point-to-site-create/ 


Q32. - (Topic 6) 

You design an Azure application that processes images. The maximum size of an image is 10 MB. The application includes a web role that allows users to upload images and a worker role with multiple instances that processes the images. The web role communicates with the worker role by using an Azure Queue service. 

You need to recommend an approach for storing images that minimizes storage transactions. 

What should you recommend? 

A. Store images in Azure Blob service. Store references to the images in the queue. 

B. Store images in the queue. 

C. Store images in OneDrive attached to the worker role instances. Store references to the images in the queue. 

D. Store images in local storage on the web role instance. Store references to the images in the queue. 

Answer: A 

Explanation: Azure Queues provide a uniform and consistent programming model across queues, tables, and blobs – both for developers and for operations teams. Microsoft Azure Blob storage can be used to store the image data; the application can use a worker role in Azure to perform background processing tasks on the images, and it may use shared access signatures to control access to the images by users. Azure blobs provide a series of containers aimed at storing text or binary data. Block blob containers are ideal for streaming data, while page blob containers can be used for random read/write operations. 

Reference: 5 – Executing Background Tasks 
https://msdn.microsoft.com/en-gb/library/ff803365.aspx 

Reference: Azure Queues and Service Bus Queues - Compared and Contrasted 
https://msdn.microsoft.com/en-us/library/azure/hh767287.aspx
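The pattern in option A can be simulated locally: the large image bytes go into a blob-like store once, and the queue carries only a short reference, so each queue message stays tiny and the number of storage transactions is minimized. A toy sketch, with plain Python containers standing in for the Azure services:

```python
# Sketch of option A: store images in a blob-like store and pass only
# references through the queue. Local stand-ins replace the Azure services.
from collections import deque

blob_store = {}       # stands in for an Azure Blob container
job_queue = deque()   # stands in for the Azure Queue service

def web_role_upload(name: str, image_bytes: bytes) -> None:
    blob_store[name] = image_bytes   # one blob write
    job_queue.append(name)           # queue message carries only the reference

def worker_role_process() -> str:
    name = job_queue.popleft()       # dequeue the small reference
    image = blob_store[name]         # one blob read for the full image
    return f"processed {name} ({len(image)} bytes)"

web_role_upload("photo-001.jpg", b"\xff\xd8" + b"\x00" * 1024)  # fake JPEG
print(worker_role_process())
```

Queue messages are limited to 64 KB, so a 10 MB image could not be stored in the queue directly (ruling out option B) even if transaction count were not a concern.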