April 25, 2017
AMQP, Analytics, Azure, Azure IoT Suite, Cloud Computing, Cloud Services, Cloud to Device, Communication Protocols, Connected, Connectivity, Device to Cloud, Emerging Technologies, HTTP 1.1, Identity of Things (IDoT), Intelligent Cloud, Intelligent Edge, Internet of Things, IoT, IoT Central, IoT Devices, IoT Edge, IoT Hub, IoT Privacy, IoT Security, Machines, MQTT, PaaS, SaaS, Stream Analytics
Microsoft has today released its IoT SaaS offering for customers and partners, called “Microsoft IoT Central”. IoT Central enables powerful IoT scenarios without requiring cloud solution expertise; it simplifies the development process and helps customers bring solutions to market quickly, making digital transformation more accessible to everyone without the overhead of implementing solutions end to end.
As per Microsoft:
“IoT Central provides an easier way to create connected products that propel digital business. Take the complexity out of the Internet of Things (IoT) with a true, end-to-end IoT software as a service (SaaS) solution in the cloud that helps you build, use, and maintain smart products.”
Benefits of IoT Central:
- Proven platform and technology with enterprise grade security.
- Reduced complexities of setting up and maintaining IoT infrastructure and solutions.
- Building smart connected products with lower cost and less overhead, ensuring higher customer satisfaction.
- Quickly adapt to changing environments.
Those who need full control over an end-to-end implementation can still choose the PaaS solution, Azure IoT Suite.
Below is a picture from @JanakiramMSV’s article on forbes.com, to give you a high-level look at all the IoT offerings from Microsoft.
January 7, 2017
Analytics, Azure, Azure SDK, Cloud Computing, Communication Protocols, Constrained Networks/Devices, Data Collection, Data Integration, Emerging Technologies, Identity of Things (IDoT), Internet of Things, Interoperability, IoT, PaaS, Performance, Predictive Analytics, Predictive Maintenance, Realtime Analytics, Reliability, Scalability, Self Driven Cars, Solutions, Stream Analytics, Tech-Trends, Windows Azure
Microsoft Azure IoT Suite preconfigured solutions help you create your own fully integrated solutions, tailored to your specific needs, in the following three areas. Using these ready-to-consume solutions accelerates time to market for your IoT (Internet of Things) requirements.
- Remote Monitoring – Connect and monitor your devices to analyze untapped data and improve business outcomes by automating processes. For example: as a car manufacturer, give customers the option to remotely monitor their car’s condition and suggest when it needs a refuel or an oil change.
- Connected Factory – Connect and monitor your factory’s industrial devices for insights using OPC UA to drive operational productivity. For example: spare parts are essential for car manufacturing, and automated solutions can ensure the spare-parts inventory needed to meet daily, weekly, or monthly manufacturing targets is available on time.
- Predictive Maintenance – Anticipate maintenance needs and avoid unscheduled downtime by connecting and monitoring your devices. For example: as a car manufacturer’s service arm, you can get near real-time performance data from the cars your company manufactured, predict the health of each component in a car, and offer timely maintenance, sending real-time reminders and notifications to customers. This ensures higher customer satisfaction and more business value for the organization, as it attracts more sales and good customer feedback.
These solutions will help you to:
- Connect and scale quickly – Use preconfigured solutions, and accelerate the development of your Internet of Things (IoT) solution.
- Analyze and process data – Collect previously untapped data from devices and sensors, and use built-in capabilities to visualize—and act on—that data.
- Integration and Digital Transformation – Easily integrate with your systems and applications, including Salesforce, SAP, Oracle Database, and Microsoft Dynamics, making it simple to access your data and keep your disparate systems up to date.
- Enhanced security – Set up individual identities and credentials for each of your connected devices—and help retain the confidentiality of both cloud-to-device and device-to-cloud messages.
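The per-device credentials mentioned above are typically presented to IoT Hub as shared access signature (SAS) tokens. As a rough illustration (the hub name, device ID, and key below are made up), a SAS token is built by signing the URL-encoded resource URI plus an expiry timestamp with the device key:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(resource_uri, key, policy_name=None, ttl_seconds=3600):
    """Build an IoT Hub-style shared access signature (HMAC-SHA256)."""
    expiry = int(time.time()) + ttl_seconds
    uri = urllib.parse.quote(resource_uri, safe="")
    # Sign "<url-encoded-uri>\n<expiry>" with the base64-decoded key.
    to_sign = f"{uri}\n{expiry}".encode()
    signature = base64.b64encode(
        hmac.new(base64.b64decode(key), to_sign, hashlib.sha256).digest()
    )
    token = (
        f"SharedAccessSignature sr={uri}"
        f"&sig={urllib.parse.quote(signature, safe='')}&se={expiry}"
    )
    if policy_name:  # device tokens omit skn; policy tokens include it
        token += f"&skn={policy_name}"
    return token

# Hypothetical device key, for illustration only.
device_key = base64.b64encode(b"not-a-real-device-key").decode()
print(generate_sas_token("myhub.azure-devices.net/devices/device-01", device_key))
```

Because each device has its own key, revoking one device’s access never affects the rest of the fleet.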
December 11, 2016
Azure, Cloud Computing, Cloud to Device, Communication Protocols, Connectivity, Constrained Networks/Devices, Data Hubs, Device Shadow, Device to Cloud, Device Twin, Emerging Technologies, Event Hubs, HTTP2, Identity of Things (IDoT), Intelligent Cloud, Internet of Things, Interoperability, IoT, IoT Hub, IoT Privacy, IoT Security, Messaging, Microsoft, Performance, Protocols, Reliability, Scalability, Tech-Trends
With this article I am trying to give you a bird’s-eye-view comparison of IoT Hub and Azure Event Hubs, so that some of you may stop feeling that there is nothing new in IoT Hub.
For the purposes of this article, I have put together a table with a side-by-side comparison of some important (and desirable) features of an IoT Hub-like platform.
| Feature | Azure IoT Hub | Azure Event Hubs |
| --- | --- | --- |
| Communication patterns | Supports both device-to-cloud and cloud-to-device (bidirectional) communication. | Supports only device-to-cloud communication. |
| Device state | Maintains device state using Device Twins, which can be queried whenever needed. | Not available. |
| Protocols | AMQP 1.0, AMQP over WebSockets, MQTT 3.1.1, MQTT over WebSockets, HTTPS 1.1. | AMQP 1.0, AMQP over WebSockets, and HTTPS 1.1 only. |
| Protocol gateway | Provides the IoT protocol gateway, a customizable implementation for channelling industrial/custom protocols. | Not available. |
| Device identity | Provides a per-device identity, easily revocable through IoT Hub device management. | Shared access policies with limited revocation capabilities. |
| Device management | Rich device management capabilities: individually enable/disable or provision devices, change security keys as needed, and view/identify individual device problems easily. | Does not provide per-device metrics; only high-level aggregated metrics. |
| Scale | Scalable to thousands or millions of simultaneously connected devices. | Limited number of simultaneous connections (up to 5,000 per the Azure Service Bus quotas); partitioning lets you spread messages across partitions. |
| SDK Support/Developer Support | Very good integration SDKs and developer support: the Azure IoT device SDKs and IoT Gateway SDK cover almost all device/OS platforms and support languages such as C#, Node.js, Java, and Python; direct MQTT, AMQP, and REST-based HTTP APIs are also provided, with very detailed documentation. | .NET, Java, and C SDKs, apart from AMQP and HTTP API interfaces. |
| Files/Images Upload Capability | Supports IoT devices/solutions uploading files/images/snapshots to the cloud, with a workflow defined for processing them. | Not supported. |
| Message routing | Very decent routing capability out of the box: up to 10 endpoints can be defined, with advanced rules for how routing should occur. | Requires additional programming and hosting to support routing as needed. |
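On the scale row above: Event Hubs spreads load by partitioning the event stream, with a partition key determining which partition an event lands in. A minimal sketch of key-to-partition assignment (the real service uses its own internal hash, so this is only illustrative):

```python
import hashlib

def partition_for(key: str, partition_count: int = 4) -> int:
    """Map a partition key to a partition index via a stable hash."""
    # A stable digest (unlike Python's salted hash()) keeps the mapping
    # consistent across processes and restarts.
    digest = hashlib.sha256(key.encode()).digest()
    return int.from_bytes(digest[:8], "big") % partition_count

# Events carrying the same key always land in the same partition,
# which preserves per-device ordering.
assert partition_for("device-42") == partition_for("device-42")
print(partition_for("device-42"), partition_for("device-7"))
```

The important property is that all events for one device stay ordered within a single partition, while different devices spread across partitions for throughput.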
From this comparison table, you can see that IoT Hub is the right candidate for your IoT solution needs, as Event Hubs lacks certain capabilities that are essential for an IoT ingestion point. If you only need to send messages to the cloud and don’t require the richer capabilities IoT Hub provides, you can choose Event Hubs.
Remember, with more power comes more responsibility; that is what IoT Hub intends to provide you.
Hope this overview was helpful. Please feel free to comment or start a discussion any time, and please share your feedback on this article as well.
Event Hubs is a feature within Azure intended to help with the challenge of handling event-based messaging at huge scale. To be specific, it is a highly scalable data-streaming platform.
The idea is that if you have apps or devices publishing telemetry events, Event Hubs can be the ingestion point: you send/push messages to an event hub, and under the hood it creates a stream of all of these events which can be read at any time in different ways. Events can be processed directly or through stream processing and pushed on for real-time analytics, or the processed messages can be stored in cold storage for historical analytics on your data.
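The append-only, replayable nature of that event stream can be sketched with a toy in-memory model (`MiniEventHub` is a made-up name for illustration, not part of any SDK):

```python
from collections import defaultdict

class MiniEventHub:
    """Toy model of Event Hubs: an append-only log per partition."""
    def __init__(self):
        self.log = defaultdict(list)

    def send(self, partition, event):
        # Ingestion: events are only ever appended, never updated in place.
        self.log[partition].append(event)

    def receive(self, partition, offset=0):
        # Readers track their own offset; reading does not remove events,
        # so real-time and batch consumers can each replay the stream.
        return self.log[partition][offset:]

hub = MiniEventHub()
for reading in ({"temp": 21}, {"temp": 22}, {"temp": 23}):
    hub.send(0, reading)
print(hub.receive(0, offset=1))  # a consumer resuming from offset 1
```

The key contrast with a queue is visible here: receiving is a read at an offset, not a destructive pop, which is what lets multiple independent consumers process the same stream.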
- Event Hubs can ingest and process messages at large scale, such as millions of messages per second.
- Provides Publish/Subscribe communication capabilities
- Support for AMQP and HTTP protocols
- SAS token based authentication to identify and authenticate event publisher.
- Scalable throughput units, purchased as needed.
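As a back-of-the-envelope sizing aid for those throughput units, assuming the commonly documented ingress limits of roughly 1 MB/s or 1,000 events/s per unit (whichever is hit first; check current quotas before relying on these numbers):

```python
import math

def required_throughput_units(events_per_sec, avg_event_kb):
    """Estimate throughput units needed for a given ingress workload."""
    by_rate = events_per_sec / 1000.0                    # events/s limit
    by_volume = events_per_sec * avg_event_kb / 1024.0   # MB/s limit
    # The binding constraint is whichever limit is reached first.
    return max(1, math.ceil(max(by_rate, by_volume)))

print(required_throughput_units(5000, 0.5))  # rate-bound workload
print(required_throughput_units(500, 4.0))   # volume-bound workload
```

Small, frequent events tend to be rate-bound, while large payloads become volume-bound; sizing against both limits avoids surprises.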
To read more about Event Hubs visit here
October 1, 2016
Architecture, Azure, Cloud Computing, Cloud Services, Horizontal Scaling, Performance, Reliability, Resiliency, Scalability, Scale Down, Scale In, Scale Out, Scale Up, Software/System Design, Vertical Scaling, Virtualization
When you work with cloud computing, or with any scalable, highly available application, you will often hear two terms: Scale Out and Scale Up, also called Horizontal Scaling and Vertical Scaling. I thought I would cover the basics and provide more clarity for developers and IT specialists.
What is Scalability?
Scalability is the capability of a system, network, or process to handle a growing amount of work, or its potential to be enlarged to accommodate that growth. For example, a system is considered scalable if it is capable of increasing its total output under an increased load when resources (typically hardware) are added.
A system whose performance improves after adding hardware, proportionally to the capacity added, is said to be a scalable system.
This applies to any system, such as:
- Commercial websites or web applications with a large and frequently growing user base,
- or an immediate need to serve a high number of users for a high-profile event or campaign,
- or a streaming event that needs immediate processing capability to serve a large set of users across a certain region or globally,
- or a work-processing or data-processing job that requires higher compute capacity than usual.
Scalability can be measured in various dimensions, such as:
- Administrative scalability: The ability for an increasing number of organizations or users to easily share a single distributed system.
- Functional scalability: The ability to enhance the system by adding new functionality at minimal effort.
- Geographic scalability: The ability to maintain performance, usefulness, or usability regardless of expansion from concentration in a local area to a more distributed geographic pattern.
- Load scalability: The ability for a distributed system to easily expand and contract its resource pool to accommodate heavier or lighter loads or number of inputs. Alternatively, the ease with which a system or component can be modified, added, or removed, to accommodate changing load.
- Generation scalability: The ability of a system to scale up by adopting new generations of components. Relatedly, heterogeneous scalability is the ability to use components from different vendors.
Scale-Out/In / Horizontal Scaling:
To scale horizontally (or scale out/in) means to add more nodes to (or remove nodes from) a system, such as adding a new computer to a distributed software application.
- Load is distributed to multiple servers
- Even if one server goes down, there are servers to handle the requests or load.
- You can add up more servers or reduce depending on the usage patterns or load.
- Perfect for highly available web application or batch processing operations.
- You would need additional hardware/servers, which increases infrastructure and maintenance costs.
- You would need to purchase additional licenses for the OS or other required licensed software.
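The scale-out idea of spreading load across an elastic pool of servers can be sketched with a simple round-robin balancer (node names like `web-1` are placeholders):

```python
from itertools import cycle

class RoundRobinBalancer:
    """Distribute requests across a pool of nodes that can grow or shrink."""
    def __init__(self, nodes):
        self.nodes = list(nodes)

    def add_node(self, node):
        # Scale out: new capacity joins the pool at runtime.
        self.nodes.append(node)

    def remove_node(self, node):
        # Scale in: a node leaves the pool; remaining nodes absorb its load.
        self.nodes.remove(node)

    def route(self, requests):
        pool = cycle(self.nodes)
        return [(request, next(pool)) for request in requests]

balancer = RoundRobinBalancer(["web-1", "web-2"])
balancer.add_node("web-3")  # traffic grew, so capacity was added
print(balancer.route(["r1", "r2", "r3", "r4"]))
```

This also illustrates the availability point above: if one node is removed (failure or scale-in), the balancer simply keeps routing across the remaining nodes.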
Scale-Up/Down / Vertical Scaling:
To scale vertically (or scale up/down) means to add resources to (or remove resources from) a single node in a system, typically involving the addition of CPUs or memory to a single computer.
- Possibility to increase CPU/RAM/Storage virtually or physically.
- A single system can serve all your data/work-processing needs once the hardware upgrade is done.
- Minimal cost for upgrade
- When you are physically or virtually maxed out with limit, you do not have any other options.
- A crash could cause outages to your business processing jobs.
We discussed both approaches to scalability in detail; depending on your needs, you will have to choose the right one. Nowadays, with cloud computing platforms like Amazon AWS and Microsoft Azure, you have lots of flexible ways to scale out or scale up in a cloud environment, which provides you with virtually unlimited resources, provided you are able to pay for them accordingly.
Hope this information was helpful. Please leave a comment if you find any discrepancies or have any queries.
August 13, 2016
.NET, ASP.NET, Azure, Cloud Computing, Data Caching, Data Hubs, Emerging Technologies, KnowledgeBase, Microsoft, Performance, Redis Cache, Windows Azure Development
Azure Redis Cache is a secure data cache based on the open-source Redis, provided as a fully managed service by Microsoft. This means you don’t have to bear the burden of managing servers, software patches, and so on.
What is Redis Cache?
Redis is an open source (BSD licensed), in-memory data structure store, used as a database, cache and message broker. It supports data structures such as strings, hashes, lists, sets, sorted sets with range queries, bitmaps, hyperloglogs and geospatial indexes with radius queries. Redis has built-in replication, Lua scripting, LRU eviction, transactions and different levels of on-disk persistence, and provides high availability via Redis Sentinel and automatic partitioning with Redis Cluster.
You can run atomic operations on these types, like appending to a string; incrementing the value in a hash; pushing an element to a list; computing set intersection, union and difference; or getting the member with highest ranking in a sorted set.
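To make those operations concrete, here is a tiny in-memory mimic of a few of these command semantics. It is an illustration of the behavior only, not Redis itself (real Redis executes each command atomically on the server, which is what makes them safe under concurrency):

```python
class MiniStore:
    """In-memory mimic of a few Redis command semantics (illustration only)."""
    def __init__(self):
        self.data = {}

    def append(self, key, suffix):
        # APPEND: extend a string value, returning its new length.
        self.data[key] = self.data.get(key, "") + suffix
        return len(self.data[key])

    def incr(self, key):
        # INCR: increment a counter, creating it at 0 if missing.
        self.data[key] = self.data.get(key, 0) + 1
        return self.data[key]

    def lpush(self, key, value):
        # LPUSH: push an element onto the head of a list.
        self.data.setdefault(key, []).insert(0, value)

    def sinter(self, key_a, key_b):
        # SINTER: intersection of two sets.
        return self.data.get(key_a, set()) & self.data.get(key_b, set())

store = MiniStore()
store.incr("page:views")
store.incr("page:views")
store.data["friends:alice"] = {"bob", "carol"}
store.data["friends:bob"] = {"carol", "dave"}
print(store.data["page:views"], store.sinter("friends:alice", "friends:bob"))
# 2 {'carol'}
```

Against a real server you would issue the same verbs through a client library, but the shape of each operation is exactly what the sketch shows.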
In order to achieve its outstanding performance, Redis works with an in-memory dataset. Depending on your use case, you can persist it either by dumping the dataset to disk every once in a while, or by appending each command to a log. Persistence can be optionally disabled, if you just need a feature-rich, networked, in-memory cache.
Redis also supports trivial-to-set-up master-slave asynchronous replication, with very fast non-blocking first synchronization and auto-reconnection with partial resynchronization after a net split.
5 High-level Use Cases of Redis Cache
1. Session Cache
One of the most apparent use cases for Redis is using it as a session cache. The advantage of using Redis over other session stores, such as Memcached, is that Redis offers persistence. You can keep your application’s user, role, and authorization permission lists in Redis Cache for faster access.
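A session cache of this kind boils down to values with a time-to-live. A minimal sketch of the SETEX/GET behavior an application would rely on (an in-memory stand-in for illustration, not an actual Redis client):

```python
import time

class SessionCache:
    """Session store with per-key TTL, mimicking Redis SETEX/GET semantics."""
    def __init__(self):
        self.store = {}

    def setex(self, key, ttl_seconds, value):
        # Store the value together with its absolute expiry time.
        self.store[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key):
        entry = self.store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self.store[key]  # lazy expiry on read
            return None
        return value

sessions = SessionCache()
sessions.setex("session:42", 1800, {"user": "alice", "roles": ["admin"]})
print(sessions.get("session:42"))
```

Real Redis expires keys server-side (and persists them to disk if configured), so sessions survive application restarts, which is the persistence advantage mentioned above.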
2. Full Page Cache (FPC)
Outside of your basic session tokens, Redis provides a very easy FPC platform to operate in. Going back to consistency: even across restarts of Redis instances, with disk persistence your users won’t see a decrease in speed for their page loads.
3. Queues
Taking advantage of Redis’ in-memory storage engine to do list and set operations makes it an amazing platform to use for a message queue. Interacting with Redis as a queue should feel native to anyone used to push/pop operations on lists in programming languages such as C#, Python, Java, and PHP.
4. Leaderboards/Counting
Redis does an amazing job at increments and decrements since it is in-memory. Sets and sorted sets also make our lives easier when trying to do these kinds of operations, and Redis just so happens to offer both of these data structures.
5. Pub/Sub
The use cases for Pub/Sub are truly boundless. You can use it for social network connections, for triggering scripts based on Pub/Sub events, and even for a chat system built using Redis Pub/Sub!
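The publish/subscribe pattern itself is simple enough to sketch in a few lines. This in-process stand-in mirrors the shape of Redis SUBSCRIBE/PUBLISH (real Redis delivers messages across processes and machines, which this toy does not):

```python
from collections import defaultdict

class PubSub:
    """Minimal in-process broker in the spirit of Redis SUBSCRIBE/PUBLISH."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, channel, callback):
        # Register a callback to be invoked for every message on the channel.
        self.subscribers[channel].append(callback)

    def publish(self, channel, message):
        # Fan the message out to all current subscribers; like Redis PUBLISH,
        # return the number of subscribers that received it.
        for callback in self.subscribers[channel]:
            callback(message)
        return len(self.subscribers[channel])

broker = PubSub()
broker.subscribe("chat:lobby", lambda msg: print("received:", msg))
broker.publish("chat:lobby", "hello everyone")
```

Note the fire-and-forget semantics: a subscriber that joins after a publish never sees earlier messages, which is also how Redis Pub/Sub behaves.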
Finally, coming back to the context of this blog, here is the essential pricing model from Microsoft:
Azure Redis Cache is available in three tiers:
- Basic—Single node, multiple sizes, ideal for development/test and non-critical workloads. The Basic tier has no SLA.
- Standard—A replicated cache in a two-node primary/secondary configuration managed by Microsoft, with a high-availability SLA.
- Premium—All of the Standard-tier features, including a high-availability SLA, as well as better performance than Basic- and Standard-tier caches, bigger workloads, disaster recovery, Redis persistence, Redis cluster, and enhanced security and isolation through virtual network deployment.
- ** Basic and Standard caches are available in sizes up to 53 GB (250 MB, 1 GB, 2.8 GB, 6 GB, 13 GB, 26 GB, 53 GB).
- ** Premium caches are available in sizes up to 530 GB, with more on request.