Cloud Computing

Managed Azure Database for MySQL and PostgreSQL

June 9, 2017 App Service, Azure, Azure Data Services Platform, Azure Database for MySQL, Azure Database for PostgreSQL, Cloud Computing, Cloud Services, Data Services, Emerging Technologies, OpenSource, PaaS, SQL Data Warehouse, SQL Database, Windows Azure Development

During the Microsoft Build 2017 conference (May 10, 2017) in Seattle, Scott Guthrie (EVP of the Cloud and Enterprise Group) announced two new offerings on the Azure Database Services Platform: Azure Database for MySQL and Azure Database for PostgreSQL.

I was happy to see Microsoft filling the gap for fully managed MySQL and PostgreSQL. I recall that around April I was trying to migrate this WordPress blog from GoDaddy hosting to an Azure App Service, and since WordPress requires MySQL as its database, the only options left for me in Azure were local MySQL (MySQL in App) inside App Service, which cannot scale well, or the ClearDB service (from a Microsoft partner on Azure). I wasn't happy with the performance of either local MySQL or ClearDB, owing to my bulky blog, so I kept thinking: what if there were a managed MySQL service, just like the managed SQL Azure service?

What are Azure Database for MySQL and PostgreSQL?

Azure Database for MySQL and PostgreSQL (currently in PREVIEW) are fully managed Platform as a Service (PaaS) offerings from Microsoft Azure that relieve us from worrying about infrastructure and managing the server instance. Below is an outline of how these services stack up against the existing SQL Database offerings. As a customer you do not need to worry about compute, storage, networking, or high performance, availability, and scalability; these are ensured by the Azure Data Services Platform, with built-in monitoring.

You can easily deploy an Azure Web App with Azure Database for MySQL as the database provider, giving you a complete solution for common Content Management Systems (CMS) such as WordPress and Drupal.
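For instance, at the time of writing you can provision a preview server from the Azure CLI 2.0 roughly as follows (a sketch only: the resource names are placeholders, and the preview's exact sizing parameters are described in the service documentation):

az mysql server create \
  --resource-group myResourceGroup \
  --name mydemoserver \
  --location westus \
  --admin-user myadmin \
  --admin-password <your-password>

An equivalent az postgres server create command exists for Azure Database for PostgreSQL.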


I will cover more details in a later series. That's all for now. Thank you for reading, and leave your comments below.

Pricing Details:

Useful Links:

Big Data & Front End Development track in the Microsoft Professional Program

June 8, 2017 Analytics, Azure, Azure Data Factory, Azure Data Lake, Big Data, Big Data Analytics, Big Data Management, Data Analytics, Data Services, Emerging Technologies, Hadoop, HD Insight, IaaS, PaaS, Predictive Analytics, Realtime Analytics, SQL Azure, Stream Analytics, Windowz Azure

Earlier I introduced you to the Microsoft Professional Program for Data Science. Just a few days later, Microsoft announced the BETA availability of two more tracks: Big Data and Front End Development.

Big Data Track:

This Microsoft program will help you learn the necessary skills, from cloud storage and databases to Hadoop, Spark, and managed data services in Azure. The curriculum involves learning how to build big data solutions for batch and real-time stream processing, using Azure managed services and open-source systems like Hadoop and Spark.

If you intend to pursue a data analytics career, this is the right program for you to gain the necessary skills and insights.

Technologies you will apply to gain these skills: Azure Data Lake, Hadoop, HDInsight, Spark, Azure Data Factory, and Azure Stream Analytics.

Below is the course outline:

  • 10 COURSES  |  12-30  HOURS PER COURSE  |  8  SKILLS
  • ENROLL NOW here
  • More details here

Front End Development Track:

This track provides the skills necessary to get started with advanced front-end development using HTML5, CSS3, JavaScript, AngularJS, and Bootstrap. By the end of the curriculum you will have mastered front-end development with all the predominant modern web technologies.

So if you are a front-end UI developer, this is something you can try out to enhance your skills.

Below is the course outline:

  • 13 COURSES  |  15-30 HOURS PER COURSE  |  11 SKILLS
  • ENROLL NOW here
  • More details here

Track details

Each course runs for three months, starting at the beginning of a quarter: January-March, April-June, July-September, and October-December. The capstone runs for four weeks at the beginning of each quarter: January, April, July, and October. For exact dates for the current course run, please refer to the course detail page on edX.org.

[Microsoft]

Introduction to Data Science

June 3, 2017 Analytics, Big Data, Big Data Analytics, Big Data Management, Cloud Computing, Cold Path Analytics, Data Analytics, Data Collection, Data Hubs, Data Science, Data Scientist, Edge Analytics, Emerging Technologies, Hot Path Analytics, Human Computer Interation, Hype vs. reality, Industrial Automation, Internet of Nano Things, Internet of Things, IoT, IoT Devices, Keyword Analysis, KnowledgeBase, Machine Learning(ML), machine-to-machine (M2M), Machines, Predictive Analytics, Predictive Maintenance, Realtime Analytics, Robotics, Sentiment Analytics, Stream Analytics

We have all been hearing the terms Data Science and Data Scientist as the occupation grows more popular these days. I thought I would shed some light on this specific area of science, which may be interesting to suitably skilled readers of my blog.

Data Science is one of the hottest topics in computing and on the Internet nowadays. People and corporations have been gathering data from applications, systems, and devices for years, and now is the time to analyze it. The worldwide adoption of the Internet of Things has added even more scope for analyzing and operating, in near real time, on the huge volumes of data accumulated from these devices.

As the standard Wikipedia definition goes: "Data science, also known as data-driven science, is an interdisciplinary field about scientific methods, processes and systems to extract knowledge or insights from data in various forms, either structured or unstructured, similar to data mining."

Data Science requires the following skillset:

  • Hacking Skills
  • Mathematics and Statistical Knowledge
  • Substantive Scientific Expertise

[Image source: this article by the Berkeley Science Review.]

Data Science Process:

The data science process involves collecting raw data, processing it, cleaning it, analyzing it using models and algorithms, and visualizing the results for presentation. This process is explained through a flowchart from Wikipedia.

[Data science process flowchart, source: Wikipedia]

Who are Data Scientists?

Data scientists use their data and analytical ability to find and interpret rich data sources; manage large amounts of data despite hardware, software, and bandwidth constraints; merge data sources; ensure consistency of datasets; create visualizations to aid in understanding data; build mathematical models using the data; and present and communicate the data insights/findings.

They are often expected to produce answers in days rather than months, to work by exploratory analysis and rapid iteration, and to produce and present results with dashboards (displays of current values) rather than the papers and reports statisticians normally produce.

Importance of Data Science and Data Scientists:

“This hot new field promises to revolutionize industries from business to government, health care to academia.”

The New York Times

Data scientist is the sexiest job of the 21st century, as per the Harvard Business Review.

McKinsey & Company projects a global excess demand for 1.5 million new data scientists.

What skills are required of a data scientist? Let me share a visualization, a brain-dump image that walks through the essential skill requirements of a modern data scientist.

[Image: essential skills of the modern data scientist]

So what are you waiting for? If you are suitably skilled, get yourself onto a data science course.

Informational Sources:

Getting Started local development with Azure Cosmos DB services – Part 2

May 29, 2017 .NET, .NET Core 1.0, .NET Core 1.0.1, .NET Framework, ASP.NET, Azure, Azure SDK Tools, Azure Tools, Cloud Computing, CodeSnippets, CosmosDB, Document DB, Microsoft, PaaS, SaaS, Visual Studio 2015, Visual Studio 2015 Update 3, Visual Studio 2017, VisualStudio, VS2015, VS2017, Windows, Windows 10, Windows Azure Development, Windowz Azure

In my previous article we discussed setting up a local development environment using the Cosmos DB Emulator for Windows. In this part 2, we will cover the development, debugging, and integration aspects of using the Cosmos DB Emulator.

Developing with Cosmos DB Emulator

Once you have the Cosmos DB Emulator installed and running on your machine, you can use any supported Cosmos DB SDK or the Cosmos DB REST API to interact with the emulator. The process is the same as when using the Cosmos DB cloud service.

The Cosmos DB Emulator also provides a built-in visual explorer through which you can view, create, and edit collections and documents.


Before you integrate the Cosmos DB SDK or the Cosmos DB REST API, you need the master key for authentication. Unlike the cloud service, the Cosmos DB Emulator supports only a single fixed account and master key; you will not be able to communicate with the emulator without it.

Default Master Key:

Account name: localhost:<port>

Account key: C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==

PS: This key is to be used only with the emulator. You cannot use the same key for production (the Cosmos DB cloud service).

Furthermore, if you want to set your own key, you can run DocumentDB.Emulator.exe with the appropriate command switch; remember that the key must meet the key security requirements. See the command-line tool reference for more information.
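For example, assuming the /Key switch described in that reference (the value must be a valid base64-encoded key; the placeholder below is not a real key):

DocumentDB.Emulator.exe /Key=<your-base64-key>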

The Azure Cosmos DB Emulator is installed by default to the C:\Program Files\Azure Cosmos DB Emulator or C:\Program Files\DocumentDB Emulator directory.

Once you have the account name and key, you are good to go with development and debugging using the Azure Cosmos DB Emulator.

Let us start looking at how to use the Cosmos DB SDK. Once you add the Cosmos DB SDK for .NET from NuGet, you need to import the following namespaces to reference the necessary classes:

using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
using Microsoft.Azure.Documents.Linq;

Simple code to establish a connection:

// Connect to the Azure Cosmos DB Emulator running locally, using the DocumentClient class:
DocumentClient client = new DocumentClient(
    new Uri("https://localhost:8081"), 
    "C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==");

In the above code block we embed the endpoint and key directly in the source code. The suggested approach, which also makes it easy to point at the production service later, is to maintain them in the Web.config appSettings:

<add key="endpoint" value="https://localhost:8081/" />
<add key="authKey" value="C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==" />
 

Add a NuGet reference to Microsoft.Azure.DocumentDB (always use the latest version of the library).
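Before walking through the full sample below, here is a minimal sketch of a round trip against the emulator: it creates a database, a collection, and a document. The names "ToDoList" and "Items" are only illustrative:

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

public static async Task CreateSampleDataAsync(DocumentClient client)
{
    // Create the database if it does not already exist.
    await client.CreateDatabaseIfNotExistsAsync(new Database { Id = "ToDoList" });

    // Create a collection inside the database.
    await client.CreateDocumentCollectionIfNotExistsAsync(
        UriFactory.CreateDatabaseUri("ToDoList"),
        new DocumentCollection { Id = "Items" });

    // Insert a document; any JSON-serializable object works.
    await client.CreateDocumentAsync(
        UriFactory.CreateDocumentCollectionUri("ToDoList", "Items"),
        new { id = Guid.NewGuid().ToString(), description = "Try the emulator", isComplete = false });
}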


For the purposes of this article, I am going to use the existing ToDoList sample from the DocumentDB samples provided by Microsoft. You can find the original source at C:\Program Files\DocumentDB Emulator\Packages\DataExplorer\quickstart.


Copy and unzip DocumentDB-Quickstart-DotNet.zip, then open todo.sln in Visual Studio 2017 to see the solution structure.


Now run the application in Visual Studio.

1. You will see an initial screen.

2. Click on Create New.

3. A new record will be added to your Azure Cosmos DB Emulator.

4. To verify, open the Cosmos DB explorer in the emulator, click on Collections, and select ToDoList.

5. Expand Documents and select the item with id da305da3-c1dc-4e34-94d9-fd7f82d26c58.

Hope this article was helpful for your initial development. Share your feedback through the comments, and share this with your friends and colleagues.

Useful Links:

Getting Started local development with Azure Cosmos DB services – Part 1

May 20, 2017 .NET, Azure, Azure SDK, Azure SDK Tools, Azure Tools, Cloud Computing, Computing, CosmosDB, Data Services, Document DB, Emerging Technologies, KnowledgeBase, Microsoft, PaaS, Visual Studio 2013, Visual Studio 2015, Visual Studio 2017, VS2013, VS2015, VS2017, Windows 10, Windows Azure Development, Windows Server 2012 R2, Windows Server 2016, Windowz Azure

Azure Cosmos DB is a multi-API, multi-model, highly scalable NoSQL database service on the Microsoft Azure platform. Developing an application that consumes Azure Cosmos DB requires either a live Azure subscription or an emulator on your local machine.

The Azure Cosmos DB Emulator provides a local development/test environment for Azure Cosmos DB development. Using the emulator, you can develop and test your application locally, without an Azure subscription and without subscription costs.

In this article I am going to take you through the necessary steps and requirements to set up your local environment.

1. Prerequisites:

The Azure Cosmos DB Emulator has the following software and hardware requirements:

  • Software requirements
    • Windows Server 2012 R2, Windows Server 2016, or Windows 10
  • Minimum Hardware requirements
    • 2 GB RAM
    • 10 GB available hard disk space

2. Installation:

  • Download the Azure Cosmos DB Emulator (DocumentDb.Install.msi). ** Do not be confused by the name: Azure Cosmos DB is a superset of DocumentDB, and the DocumentDB emulator was tweaked a bit to support Cosmos DB.
  • Install DocumentDb.Install.msi

Additionally, the Azure Cosmos DB Emulator can be run on Docker for Windows. After installing Docker for Windows, you can pull the emulator image from Docker Hub:

docker pull microsoft/azure-documentdb-emulator
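After pulling the image, you can start a container along these lines (a sketch only: port 8081 is the emulator's default endpoint, and the image's page on Docker Hub lists the full set of ports to publish):

docker run -t -i -p 8081:8081 microsoft/azure-documentdb-emulator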


3. Start/Launch Azure Cosmos DB Emulator:


After some time you will see that the emulator has started. When the Azure Cosmos DB Emulator launches, it automatically opens the Azure Cosmos DB Data Explorer in your browser.

The address will appear as https://localhost:8081/_explorer/index.html

In case you have closed the browser and would like to open the explorer again later, you can open the Data Explorer by right-clicking the emulator icon in the taskbar.



Now you can write a sample app to try it out, or download the sample applications already provided by Microsoft for your preferred platform.

4. Limitations of the Azure Cosmos DB Emulator (differences between the Azure Cosmos DB Emulator and the Cosmos DB cloud service):

Since the Azure Cosmos DB Emulator provides an emulated environment running on a local developer workstation, there are some fundamental differences between the emulator and an Azure Cosmos DB account in the cloud:

The following table is also helpful in determining when to use the Cosmos DB Emulator and when to use the cloud service directly, depending on your requirements.


Cosmos DB Emulator | Cosmos DB Cloud Service
Supports only a single fixed account and a well-known master key; key regeneration is not possible. | Supports multiple accounts and different master keys; you can regenerate keys at any time from the Azure portal.
Not scalable. | Highly scalable.
Does not support larger data sets. | Supports large data sets.
Does not simulate consistency levels. | Different consistency levels available.
Does not simulate multi-region replication. | Configurable as part of the platform, on an as-needed basis.
Does not support the quota override feature. | Supports document size limit increases, increased partitioned collection storage, etc.
Might not support the most recent changes to the Cosmos DB platform. | The most recent platform updates are available.

Hope this article was helpful for getting started. If you need to understand Azure Cosmos DB development further, follow the links; I will be writing further insights in later sessions.

Azure in Germany: a complete EU cloud computing solution

May 18, 2017 .NET, Analytics, AppFabric, Azure, Azure in Germany, Azure IoT Suite, Cloud Computing, Cloud Services, Cloud Strategy, Cognitive Services, Computing, Data Analytics, Data Governance, Data Hubs, Data Warehouse, Emerging Technologies, Event Hubs, IaaS, Intelligent Edge, Internet of Things, IoT, IoT Central, IoT Hub, Machine Learning(ML), Media Services, Media Services & CDN, Messaging, Microsoft, Mobile Services, PaaS, SaaS, SQL Azure, Storage, Backup & Recovery, Stream Analytics, Virtual Machines, Windowz Azure

After my earlier article on Azure in China, I became interested in looking for other country/region-specific independent cloud datacenter offerings. I came across the Azure for US Government instance (similar to the Amazon GovCloud) and the Azure Germany datacenter. In this article I will cover only Azure in Germany.

What is Azure Germany?

Just like the regional regulatory requirements in China, Germany wanted a completely locally owned and managed Azure datacenter to meet EU/EFTA/UK requirements, and to ensure stricter access control and data access policy measures. This approach enables organizations doing business in the EU/EFTA and UK to better harness the power of cloud computing.

  • All customer data and related applications and hardware reside in Germany
  • Geo-replication between datacenters in Germany supports business continuity
  • Highly secured datacenters provide 24×7 monitoring
  • Meets all public-sector and restricted-industry requirements
  • Follows all compliance requirements for the EU/EFTA and UK
  • Lower cost, locally accessible within your business locations in Germany/EU

“Azure Germany is an isolated Azure instance in Germany, independent from other public clouds.”

Who controls it?

An independent data trustee controls access to all customer data in the Azure Germany datacenters. T-Systems International GmbH, a subsidiary of Deutsche Telekom and an experienced, well-respected IT provider incorporated in Germany, serves as the trustee, preventing disclosure of data to third parties except as the customer directs or as required by German law.

** Even Microsoft does not have access to customer data or the datacenters without approval from and supervision by the German data trustee.

What about compliance?

Azure Germany has an ongoing commitment to maintaining the strictest data protection measures, so organizations can store and manage customer data in compliance with applicable German laws and regulations, as well as key international standards. Additional compliance standards and controls that address the unique role of the German data trustee will be audited over time. Refer to: Microsoft Trust Center compliance.

[Source: Microsoft Azure]

Useful Links: