General Availability of Azure Database Services for MySQL and PostgreSQL

March 23, 2018 Azure, Cloud Computing, Cloud Services, Data Services, Emerging Technologies, Microsoft, Tech Newz, VisualStudio, VS2017

It has been a while since I wrote something on my blog. I thought of getting started again with the good news that the Microsoft Azure team has announced the general availability of Azure Database Services for MySQL and PostgreSQL. In my earlier posts, I provided an overview of the preview availability of these services as part of the Azure cloud. Now that they are generally available, customers can use these services for their general-purpose or enterprise-level database requirements in the Azure cloud.

You can read more in the Microsoft announcement blog post, Announcing general availability of Azure database services for MySQL and PostgreSQL, by Tobias Ternstrom, Principal Group Program Manager, Azure Data.

IoT is not all about Cloud

October 15, 2017 Cloud Computing, Connected, Connectivity, Emerging Technologies, Internet of Things, IoT, Machines

In the recent past, I have had discussions in several tech forums, and many people have a misconception about IoT and the cloud. Some think that anything we do with a Raspberry Pi or an Arduino, such as blinking an LED, is IoT.

I just thought of sharing some of my viewpoints on these terminologies.

  • Internet of Things (IoT) – refers to the connection of devices (other than the usual examples such as computers and smartphones) to the Internet. Cars, home and kitchen appliances, industrial devices, and even heart monitors can all be connected through the IoT.
  • Cloud Computing – often called simply “the cloud,” involves delivering data, applications, photos, videos, and more over the Internet from remote data centers.

We can break down cloud computing into six different categories:

  1. Software as a service (SaaS): Cloud-based applications run on computers off-site (or “in the cloud”). Other people or companies own and operate these servers, which users connect to, typically through a web browser.
  2. Platform as a service (PaaS): Here, the cloud houses everything necessary to build and deliver cloud-based applications. This removes the need to purchase and maintain hardware, software, hosting, and more.
  3. Infrastructure as a service (IaaS): IaaS provides companies with servers, storage, networking, and data centers on a per-use basis.
  4. Public Cloud: Providers own and operate these environments and provide quick access to users over a public network. Examples: Amazon AWS, Microsoft Azure, etc.
  5. Private Cloud: Similar to a public cloud, except that only one entity (a user, organization, company, etc.) has access. Access to the cloud is secured and isolated; only entities within the organization can use its resources. A private cloud is owned by a single organization and enables it to use cloud computing technology to centralize access to IT resources across different parts, locations, or departments. A private cloud typically exists as a controlled environment within on-premises data centers.
  6. Hybrid Cloud: Takes the foundation of a private cloud but adds public cloud access. The combination is established through a secure high-speed VPN tunnel over MPLS or other dedicated lines, or through extended connectivity gateways provided by the respective cloud vendor. In this mode, your on-premises applications can connect to cloud infrastructure and vice versa. This gives you the flexibility to host your mission-critical information on-premises while still utilizing the power of the cloud, without compromising your organization’s critical data.

Role of Cloud in IoT

Cloud is simply an enabler for IoT. It provides the necessary services and infrastructure for things to be interconnected and to operate.

Cloud provides all the essential services to increase efficiency in implementing your IoT solutions and to accumulate and operate on IoT data. The Internet of Things requires the cloud to work; I would rather say that cloud and IoT are inseparable, but IoT is not all about the cloud.

For example, millions of devices connected in an IoT ecosystem would create massive volumes of data, and you would need sufficient infrastructure to store and operate on that data to derive meaningful results from it.

Cloud service providers realized the need to offer IoT-specific services that let customers quickly create fast-to-market solutions. That is where cloud and IoT converge. Microsoft has packaged all IoT-related components into Azure IoT, Amazon has done the same with AWS IoT, and other providers such as SAP HANA and IBM Cloud offer similar suites. This lets customers pick the necessary components and build their IoT ecosystem in the cloud, or use predefined (SaaS) solutions for quick enablement.

What is the role of Raspberry Pi, Arduino, and DragonBoard then?

These are single-board computers or microcontroller boards with sufficient hardware capacity to run small or complex IoT programs, in some cases on an operating system of your choice.

These boards are typically equipped with the basic storage and computing capabilities needed to establish an IoT device or edge capability. You can write a program of your choice to blink an LED based on your conditions, as they are equipped with digital/analog I/O ports. Depending on the board's capacity, you can choose from a wide variety of operating systems, such as Raspbian or Windows 10 IoT, to install on these devices, or deploy microcontroller programs directly.

This means they are edge devices that you can program for your IoT use case. When deployed to the field together, they form an IoT network.
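For instance, the classic LED blink reduces to a tiny loop. The sketch below is a toy illustration only: the `RPi.GPIO` calls and the `LED_PIN` name in the comments are assumptions about a typical Raspberry Pi setup, and the hardware calls are kept as comments so the logic runs anywhere.

```python
def blink_sequence(cycles):
    """Return the HIGH/LOW sequence a simple LED-blink loop would drive."""
    states = []
    for _ in range(cycles):
        states.append(1)   # on real hardware: GPIO.output(LED_PIN, GPIO.HIGH)
        states.append(0)   # on real hardware: GPIO.output(LED_PIN, GPIO.LOW)
    return states

# On an actual Raspberry Pi you would first run:
#   import RPi.GPIO as GPIO
#   GPIO.setmode(GPIO.BCM); GPIO.setup(LED_PIN, GPIO.OUT)
# and insert a time.sleep() between the two writes.
print(blink_sequence(2))
```

The point is that the board itself only runs this small local program; nothing here involves the cloud until the device starts reporting data somewhere.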

Conclusion:

Enough said: IoT is not all about the cloud, but the two are inseparable in the modern world, and whatever you are doing with a Raspberry Pi, Arduino Uno, etc. may not be IoT unless there is a specific IoT use case you are trying to solve with these devices.

Managed Azure Database for MySQL and PostgreSQL

June 9, 2017 App Service, Azure, Azure Data Services Platform, Azure Database for MySQL, Azure Database for PostgreSQL, Cloud Computing, Cloud Services, Data Services, Emerging Technologies, OpenSource, PaaS, SQL Data Warehouse, SQL Database, Windows Azure Development

During the Microsoft Build 2017 conference (May 10, 2017) in Seattle, Scott Guthrie (EVP of the Cloud and Enterprise Group) announced two new offerings in the Azure Database Services Platform: Azure Database for MySQL and Azure Database for PostgreSQL.

I was happy to see Microsoft filling the gap with fully managed MySQL and PostgreSQL. I recall that around April I was trying to migrate this WordPress blog from GoDaddy hosting into an Azure App Service, and since WordPress requires MySQL as the database, the only options left for me in Azure were a local MySQL instance (MySQL in App) within the App Service, which cannot scale well, or the ClearDB service (from a Microsoft partner in Azure). Somehow I wasn't happy with the performance of either local MySQL or ClearDB, due to my bulky blog. So I thought: what if there were a managed MySQL service, just like the managed SQL Azure services?

What is Azure Database for MySQL and PostgreSQL?

Azure Database for MySQL and PostgreSQL (currently in preview) are fully managed Platform as a Service (PaaS) offerings from Microsoft Azure, which free us from worrying about infrastructure and managing the server instance. Below is an outline of how these services stack up against the existing SQL Database offerings. As a customer, you do not need to worry about compute, storage, networking, or high performance/availability/scalability; these are ensured by the Azure Data Services Platform, with built-in monitoring.

You can easily deploy an Azure Web App with Azure Database for MySQL as the database provider, providing a complete solution for common Content Management Systems (CMS) such as WordPress and Drupal.

[Image: how these services stack up against the existing SQL Database offerings]

I will cover more details in a later series. That's all for now. Thank you for reading; leave your comments.

Big Data & Front End Development track in the Microsoft Professional Program

June 8, 2017 Analytics, Azure, Azure Data Factory, Azure Data Lake, Big Data, Big Data Analytics, Big Data Management, Data Analytics, Data Services, Emerging Technologies, Hadoop, HD Insight, IaaS, PaaS, Predictive Analytics, Realtime Analytics, SQL Azure, Stream Analytics, Windowz Azure

Earlier, I introduced you to the Microsoft Professional Program for Data Science. A few days later, Microsoft announced the beta availability of two more tracks: Big Data and Front End Development.

Big Data Track:

This Microsoft program will help you learn the necessary skills, from cloud storage and databases to Hadoop, Spark, and managed data services in Azure. The curriculum involves learning how to build big data solutions for batch and real-time stream processing using Azure managed services and open-source systems like Hadoop and Spark.

If you intend to pursue a data analytics career, this is the right program for you to gain the necessary insights.

Technologies you will apply to gain these skills: Azure Data Lake, Hadoop, HDInsight, Spark, Azure Data Factory, and Azure Stream Analytics.

Below is the course outline:

  • 10 COURSES  |  12-30 HOURS PER COURSE  |  8 SKILLS
  • Enrollment and further details are available on edX.org

Front End Development Track:

This track provides the necessary skills to get started with advanced front-end development using HTML5, CSS3, JavaScript, AngularJS, and Bootstrap. By the end of the curriculum, you will have mastered front-end development with all the predominant modern web technologies.

So if you are a front-end UI developer, this is something you can try out to enhance your skills.

Below is the course outline:

  • 13 COURSES  |  15-30 HOURS PER COURSE  |  11 SKILLS
  • Enrollment and further details are available on edX.org

Track details

Each course runs for three months and starts at the beginning of a quarter: January-March, April-June, July-September, and October-December. The capstone runs for four weeks at the beginning of each quarter: January, April, July, and October. For the exact dates of the current course run, please refer to the course detail page on edX.org.

[Microsoft]

Introduction to Data Science

June 3, 2017 Analytics, Big Data, Big Data Analytics, Big Data Management, Cloud Computing, Cold Path Analytics, Data Analytics, Data Collection, Data Hubs, Data Science, Data Scientist, Edge Analytics, Emerging Technologies, Hot Path Analytics, Human Computer Interation, Hype vs. reality, Industrial Automation, Internet of Nano Things, Internet of Things, IoT, IoT Devices, Keyword Analysis, KnowledgeBase, Machine Learning(ML), machine-to-machine (M2M), Machines, Predictive Analytics, Predictive Maintenance, Realtime Analytics, Robotics, Sentiment Analytics, Stream Analytics

We have all been hearing the terms Data Science and Data Scientist as the occupation becomes more popular these days. I thought of shedding some light on this specific area of science, which may interest suitably skilled readers of my blog.

Data science is one of the hottest topics in computing and on the Internet nowadays. People and corporations have been gathering data from applications, systems, and devices, and now is the time to analyze it. The worldwide adoption of the Internet of Things has also added more scope for analyzing and operating on the huge volumes of data accumulated from these devices in near real time.

As the standard Wikipedia definition goes, “Data science, also known as data-driven science, is an interdisciplinary field about scientific methods, processes and systems to extract knowledge or insights from data in various forms, either structured or unstructured, similar to data mining.”

Data Science requires the following skillset:

  • Hacking Skills
  • Mathematics and Statistical Knowledge
  • Substantive Scientific Expertise

[Image: data science skill-set Venn diagram]

[Image Source: From this article by Berkeley Science Review.]

Data Science Process:

The data science process involves collecting raw data, processing it, cleaning it, analyzing it using models and algorithms, and visualizing the results for presentation. This process is explained in a visual diagram from Wikipedia.


[Data science process flowchart; source: Wikipedia]
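As a toy, purely illustrative sketch of the collect, clean, analyze, and present steps (all values and the one-sigma threshold here are made up for the example):

```python
import statistics

# "Collected" raw readings, with some missing values.
raw = [12.1, None, 11.8, 200.0, 12.4, None, 11.9]

# Cleaning: drop the missing values.
cleaned = [x for x in raw if x is not None]

# Analysis: a crude one-sigma outlier filter, then a summary statistic.
mean = statistics.mean(cleaned)
stdev = statistics.stdev(cleaned)
analyzed = [x for x in cleaned if abs(x - mean) <= stdev]

# "Presentation": report the result.
summary = {"n": len(analyzed), "mean": round(statistics.mean(analyzed), 2)}
print(summary)
```

Real pipelines replace each step with far heavier machinery (ingestion systems, feature engineering, statistical models, dashboards), but the shape of the process is the same.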

Who are Data Scientists?

Data scientists use their data and analytical ability to find and interpret rich data sources; manage large amounts of data despite hardware, software, and bandwidth constraints; merge data sources; ensure consistency of datasets; create visualizations to aid in understanding data; build mathematical models using the data; and present and communicate the data insights/findings.

They are often expected to produce answers in days rather than months, work by exploratory analysis and rapid iteration, and to produce and present results with dashboards (displays of current values) rather than papers/reports, as statisticians normally do.

Importance of Data Science and Data Scientist:

“This hot new field promises to revolutionize industries from business to government, health care to academia.”

The New York Times

Data Scientist is the sexiest job of the 21st century, as per Harvard Business Review.

McKinsey & Company projects a global excess demand of 1.5 million new data scientists.

What are the skills required for a data scientist? Let me share a visualization of the essential skill requirements for a modern data scientist.

[Image: essential skills of a modern data scientist]

So what are you waiting for? If you are suitably skilled, get yourself onto a data science course.

Getting Started local development with Azure Cosmos DB services – Part 2

May 29, 2017 .NET, .NET Core 1.0, .NET Core 1.0.1, .NET Framework, ASP.NET, Azure, Azure SDK Tools, Azure Tools, Cloud Computing, CodeSnippets, CosmosDB, Document DB, Microsoft, PaaS, SaaS, Visual Studio 2015, Visual Studio 2015 Update 3, Visual Studio 2017, VisualStudio, VS2015, VS2017, Windows, Windows 10, Windows Azure Development, Windowz Azure

In my previous article, we discussed setting up a local development environment using the Cosmos DB Emulator for Windows. In this part 2 of the article, we will cover the developing, debugging, and integration aspects of using the Cosmos DB Emulator.

Developing with Cosmos DB Emulator

Once you have the Cosmos DB Emulator installed and running on your machine, you can use any supported Cosmos DB SDK or the Cosmos DB REST API to interact with the emulator. The process is the same as when using the Cosmos DB cloud service.

The Cosmos DB Emulator also provides a built-in visual explorer through which you can view, create, and edit collections and documents.


Before you integrate the Cosmos DB SDK or the Cosmos DB REST API, you need the master key for authentication. Unlike the cloud service, the Cosmos DB Emulator supports only a single fixed account and master key. You cannot communicate with the emulator without this master key.

Default Master Key:

Account name: localhost:<port>

Account key: C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==

PS: This key is to be used only with the emulator. You cannot use the same key in production (the Cosmos DB cloud service).
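Incidentally, if you talk to the emulator through the REST API rather than an SDK, the master key is not sent as-is: each request carries an authorization token derived from it with HMAC-SHA256. The Python sketch below follows the documented master-key token scheme; the verb, resource type, resource link, and date shown are illustrative values only.

```python
import base64
import hashlib
import hmac
import urllib.parse

def cosmos_auth_token(verb, resource_type, resource_link, date_utc, master_key):
    """Build a Cosmos DB REST API master-key authorization token."""
    key = base64.b64decode(master_key)
    # Signature payload: lowercase verb, resource type, and date,
    # each terminated by a newline, ending with an empty line.
    payload = (f"{verb.lower()}\n{resource_type.lower()}\n{resource_link}\n"
               f"{date_utc.lower()}\n\n")
    signature = base64.b64encode(
        hmac.new(key, payload.encode("utf-8"), hashlib.sha256).digest()
    ).decode()
    # The token itself is URL-encoded before being placed in the header.
    return urllib.parse.quote(f"type=master&ver=1.0&sig={signature}")

# Example using the emulator's well-known key and an illustrative GET on a collection:
emulator_key = "C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw=="
token = cosmos_auth_token("GET", "colls", "dbs/ToDoList",
                          "Mon, 29 May 2017 00:00:00 GMT", emulator_key)
print(token)
```

The resulting token goes into the `Authorization` header, alongside an `x-ms-date` header carrying the same date string used in the signature.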

Furthermore, if you want to set your own key, you can run DocumentDB.Emulator.exe with the appropriate command-line switch. Remember that the key must meet the key security requirements. See the command-line tool reference for more information.

The Azure Cosmos DB Emulator is installed by default to the C:\Program Files\Azure Cosmos DB Emulator  or C:\Program Files\DocumentDB Emulator  directory.

Once you have account name and key, you are good to go with development and debugging using Azure Cosmos DB emulator.

Let us start looking at how to use the Cosmos DB SDK. Once you add the Cosmos DB SDK for .NET from NuGet, you need to import the following namespaces to reference the necessary classes.

using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
using Microsoft.Azure.Documents.Linq;

Simple code to establish a connection:

// Connect to the Azure Cosmos DB Emulator running locally using the DocumentClient class:
DocumentClient client = new DocumentClient(
    new Uri("https://localhost:8081"), 
    "C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==");

In the above code block, we are embedding the endpoint and key directly in the source code. A better approach, which also makes it easy to point to the production service later, is to keep them in the Web.config appSettings:

<add key="endpoint" value="https://localhost:8081/" />
<add key="authKey" value="C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==" />

Add a NuGet reference to Microsoft.Azure.DocumentDB (always use the latest version of the library).


For the purposes of this article, I am going to use the existing ToDoList sample from the DocumentDB samples provided by Microsoft. You can find the original source in C:\Program Files\DocumentDB Emulator\Packages\DataExplorer\quickstart.


Copy and unzip DocumentDB-Quickstart-DotNet.zip, open todo.sln in Visual Studio 2017, and your solution structure will look like this:


Now run the application in Visual Studio.

1. You will see an initial screen:


2. Click on Create New:


3. A new record will be added to your Azure Cosmos DB Emulator:


4. To verify in the Cosmos DB Emulator, open the Cosmos DB explorer, click Collections, and select ToDoList.


5. Expand Documents and select the item with id da305da3-c1dc-4e34-94d9-fd7f82d26c58.


I hope this article was helpful for getting started with development. Share your feedback through comments, and share this with your friends and colleagues.
