Friday, January 17

Microservices Training | Microservices Docker Example | Microservices Tu...

Microservices are self-contained, independent application units that each fulfill only one specific business function, so they can be considered small applications in their own right. What happens if you decide to build several microservices with different technology stacks? Your team will soon be in trouble, as developers have to manage even more environments than they would with a traditional monolithic application.
The solution is to use containers to encapsulate each microservice. Docker is a tool that helps you manage those containers.
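To make this concrete, here is a minimal sketch of the kind of single-purpose service that would typically be packaged into its own Docker container. It is written in Python using only the standard library; the service name, endpoint and port are my own illustrative assumptions, not something taken from the tutorial.

    # Minimal single-purpose "pricing" microservice (illustrative sketch).
    # Each such service would normally be packaged into its own Docker container.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class PricingHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # One narrowly scoped business function: return a price quote.
            if self.path.startswith("/price"):
                body = json.dumps({"sku": "demo-sku", "price": 9.99}).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_response(404)
                self.end_headers()

    if __name__ == "__main__":
        # The port is an arbitrary choice for this example.
        HTTPServer(("0.0.0.0", 8080), PricingHandler).serve_forever()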

I am sharing a free YouTube tutorial that we used to train our team on developing microservices.

https://youtu.be/UWl7X2fUWTM?t=21

Another tutorial on Microservices, Docker, and Kubernetes




Sunday, January 12

Database design for Microservices

History of Microservices - 

A workshop of software architects held near Venice in May 2011 used the term "microservice" to describe what the participants saw as a common architectural style that many of them had been recently exploring. In May 2012, the same group decided on "microservices" as the most appropriate name. James Lewis presented some of those ideas as a case study in March 2012 at 33rd Degree in Kraków in "Micro services - Java, the Unix Way", as did Fred George around the same time. Adrian Cockcroft, former director for the Cloud Systems at Netflix, described this approach as "fine grained SOA" and pioneered the style at web scale, as did many of the others mentioned in this article - Joe Walnes, Dan North, Evan Bottcher and Graham Tackley. (Source: Wikipedia)

Accepted Definition of Microservices - 

Microservices are a software development technique - a variant of the service-oriented architecture (SOA) structural style - that arranges an application as a collection of loosely coupled services. In a microservices architecture, services are fine-grained and the protocols are lightweight.



 Database design for Microservices

The main idea behind microservices architecture is that some types of applications become easier to build and maintain when they are broken down into smaller, composable pieces that work together. The main benefit of the microservices architecture is that it improves agility and reduces development time. When you correctly decompose a system into microservices, you can develop and deploy each microservice independently and in parallel with the other services.


In order to be able to develop microservices independently, they must be loosely coupled. Each microservice's persistent data must be private to that service and only accessible via its API. If two or more microservices were to share persistent data, you would need to carefully coordinate changes to the data's schema, which would slow down development.
There are a few different ways to keep a service's persistent data private. You do not need to provision a database server for each service. For example, if you are using a relational database then the options are:
  • Private-tables-per-service – each service owns a set of tables that must only be accessed by that service
  • Schema-per-service – each service has a database schema that’s private to that service
  • Database-server-per-service – each service has its own database server.
Private-tables-per-service and schema-per-service have the lowest overhead. Using a schema per service is ideal since it makes ownership clearer. For some applications, it might make sense for database-intensive services to have their own database server.
It is a good idea to create barriers that enforce this modularity. You could, for example, assign a different database user id to each service and use a database access control mechanism. Without some kind of barrier to enforce encapsulation, developers will always be tempted to bypass a service's API and access its data directly. It might also make sense to have a polyglot persistence architecture: for each service you choose the type of database that is best suited to that service's requirements. For example, a service that does text searches could use Elasticsearch, while a service that manipulates a social graph could use Neo4j. It might not make sense to use a relational database for every service.
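As a rough illustration of the schema-per-service option and the per-service database user barrier, here is a minimal Python sketch. The service names, schema names and user names are made-up assumptions; a real setup would combine this with database GRANTs, as noted in the comments.

    # Sketch: schema-per-service with a separate database user per service.
    # Service names, schema names and user names are illustrative assumptions.
    SERVICE_DB_CONFIG = {
        "orders":  {"db_user": "orders_svc",  "schema": "orders"},
        "billing": {"db_user": "billing_svc", "schema": "billing"},
        "catalog": {"db_user": "catalog_svc", "schema": "catalog"},
    }

    def connection_settings(service: str, host: str = "db.internal", db: str = "appdb") -> dict:
        # Each service connects with its own user and is restricted to its own schema.
        # A database grant such as:
        #   GRANT USAGE ON SCHEMA orders TO orders_svc;
        # is the barrier that stops other services from bypassing the API
        # and reading the tables directly.
        cfg = SERVICE_DB_CONFIG[service]
        return {"host": host, "database": db,
                "user": cfg["db_user"], "search_path": cfg["schema"]}

    if __name__ == "__main__":
        print(connection_settings("billing"))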
There are some downsides to keeping a service's persistent data private. Most notably, it can be challenging to implement business transactions that update data owned by multiple services. Rather than using distributed transactions, you typically must use an eventually consistent, event-driven approach to maintain database consistency.
Another problem is that it is difficult to implement some queries because you can't do database joins across the data owned by multiple services. Sometimes you can join the data within a service. In other situations, you will need to use Command Query Responsibility Segregation (CQRS) and maintain denormalized views.
Another challenge is that services sometimes need to share data. For example, let's imagine that several services need access to user profile data. One option is to encapsulate the user profile data in a service that is then called by other services. Another option is to use an event-driven mechanism to replicate data to each service that needs it.
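Here is a minimal Python sketch of the event-driven replication idea described above. The in-memory bus, the event name and the payload fields are assumptions made purely for illustration; a real system would use a message broker such as Kafka or RabbitMQ.

    # Sketch: event-driven replication of user profile data between services.
    # The in-memory bus stands in for a real message broker.
    from collections import defaultdict

    class EventBus:
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, event_type, handler):
            self._subscribers[event_type].append(handler)

        def publish(self, event_type, payload):
            for handler in self._subscribers[event_type]:
                handler(payload)  # a real broker would deliver this asynchronously

    # The "order" service keeps its own local, eventually consistent copy
    # of the profile fields it needs, instead of joining across databases.
    order_service_profiles = {}

    def on_profile_updated(event):
        order_service_profiles[event["user_id"]] = {"name": event["name"]}

    bus = EventBus()
    bus.subscribe("UserProfileUpdated", on_profile_updated)

    # The "profile" service publishes a change; subscribers update their copies.
    bus.publish("UserProfileUpdated", {"user_id": 42, "name": "Asha"})
    print(order_service_profiles)   # {42: {'name': 'Asha'}}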
In summary, it is important that each service's persistent data remain private. There are, however, a few different ways to accomplish this, such as a schema per service. Some applications benefit from a polyglot persistence architecture that uses a mixture of database types. A downside of not sharing databases is that maintaining data consistency and implementing queries becomes more challenging.

Wednesday, January 1

How is AWS Lambda used in Localytics?


Localytics is a Boston-based web and mobile app analytics and engagement company. Its marketing and analytics tools are used extensively by major brands, such as ESPN, eBay, Fox, Salesforce and The New York Times, to understand and evaluate the performance of their apps and to engage with both existing and new customers.

Use case in Localytics

The software developed by Localytics is employed in more than 37,000 apps on more than 3 billion devices all around the world.

Regardless of how popular Localytics is now, it faced some serious challenges before it started using Lambda.

Let's look at those challenges before we discuss how Lambda came to the rescue and helped Localytics overcome them.

Challenges

  • Billions of data points uploaded every day from the different mobile applications running Localytics analytics software are fed into the pipeline that the team supports.
  • Additional capacity planning, utilization monitoring, and infrastructure management were required whenever the engineering team had to access subsets of data in order to create new services.
  • The platform team wanted to move toward enabling self-service for the engineering teams.
  • Every time a microservice was added, the main analytics processing service for Localytics had to be updated.


   

How AWS Lambda helped

  • Localytics now uses AWS to send about 100 billion data points monthly through Elastic Load Balancing (ELB), which distributes the incoming application traffic across multiple targets.
  • Afterward, the data goes to Amazon Simple Queue Service (SQS), which helps decouple and scale microservices, distributed systems, and serverless applications.
  • Then it reaches Amazon Elastic Compute Cloud (EC2) and finally flows into an Amazon Kinesis stream, which makes it easy to collect, process, and analyze real-time streaming data so that Localytics can get timely insights and react quickly to new information.
  • With the help of AWS Lambda, a new microservice is created for each new feature of the marketing software to access Amazon Kinesis. The microservices can access the data in parallel.
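To give a rough idea of what the Lambda side of such a pipeline can look like, here is a hedged Python sketch of a Lambda handler consuming records from a Kinesis stream. The payload fields ("app_id", "event") are my own illustrative assumptions, not Localytics' actual code.

    # Sketch of an AWS Lambda handler reading records from an Amazon Kinesis stream.
    import base64
    import json

    def lambda_handler(event, context):
        processed = 0
        for record in event.get("Records", []):
            # Kinesis delivers the payload base64-encoded inside each record.
            raw = base64.b64decode(record["kinesis"]["data"])
            data_point = json.loads(raw)
            # Each microservice applies only its own feature-specific logic here.
            handle_data_point(data_point)
            processed += 1
        return {"processed": processed}

    def handle_data_point(data_point):
        # Placeholder for the feature-specific work of this particular microservice.
        print(data_point.get("app_id"), data_point.get("event"))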


 

Friday, December 6

Walking In The Cloud


I have been discussing cloud strategy with a group of architects, and once again it became evident that most people still seem to think the cloud is just another product that you can adopt without any real need for an enterprise-wide cloud strategy. It is not as simple as selecting one cloud service provider from Amazon Web Services (AWS), Google, Microsoft Azure, Salesforce, and IBM. Enterprises should engage a cloud architect, or else take help from the cloud service provider, to define their cloud strategy. For example, if you are considering AWS, an AWS architect will provide architecture guidance to the enterprise to define its cloud strategy. Each vendor has multiple product offerings for its cloud, and it is important to understand how the cloud service provider will replace your hardware infrastructure and how it will provide a robust solution.

Key points for deciding the cloud strategy
  • How is data going to be stored in the cloud, and how will you save data center expenses?
  • How will the cloud enable applications to scale on demand?
  • How is cloud capacity planning different from capacity planning for owned infrastructure?
  • How will the enterprise save capital expenses by way of pay-as-you-use variable expenses?
  • How will the cloud increase speed of delivery by way of APIs and reusable services?
  • How will the business become more agile by moving to a cloud-based architecture?
  • What additional features of each cloud service provider differentiate it from the others?

Many enterprises still lack clarity about their cloud strategy because they are not familiar with the cloud offerings. An enterprise that wants to adopt the cloud across all its business units must have a mature and well-formed understanding of its enterprise architecture and a clear view of its components. Enterprise architecture planning enables an enterprise to build the structural foundations to support proposed business strategies. It captures the vision of an enterprise by integrating its dimensions to contextualize transformation strategies, organizational structures, business capabilities, data pools, IT applications, and all technology objects. Every business unit of an enterprise is subject to change, and each change may have significant consequences throughout organizational domains.

Cloud computing is a paradigm for decentralizing data centers by virtualizing both infrastructure and platform and enabling services over the internet. It gives access to platforms, services, and tools from browsers deployed across millions of terminals. It also reduces the management and maintenance of all the resources associated with technology and infrastructure while providing dynamism, independence, portability, usability, and scalability of platform tools.
 

Sunday, November 10

Android Error Solution / Solved : Unable to detect adb version, exit value 0xc0000135

I have a Windows 10 license but I prefer using Windows 8.1 on my development laptop. Recently I refreshed my Windows 8.1 installation and reinstalled Android Studio 3.5.2, and I started getting the following error as soon as I opened Android Studio.

The error basically happens because I have a fresh Windows 8.1 with all required updates, but adb requires the 'Windows Universal C Runtime', which normally gets installed as part of Windows Update; because of the Windows reinstall that update is missing. As mentioned in the error text, Android Studio's recommended solution is to download the C runtime package from the Microsoft support website (the URL is in the image below), or else you can download the 3 ADB-specific files from the internet and add them to your platform-tools folder on your Windows machine.

Android Studio Error : Unable to detect adb version, exit value 0xc0000135





 


Solution:

  1. Before you begin implementing changes, back up the platform-tools folder on your machine.
  2. Add the platform-tools directory path to the system PATH environment variable.
  3. Replace the following 3 files in your platform-tools directory at the path (C:\Users\{YourAccount}\AppData\Local\Android\Sdk\platform-tools). As you might know, {YourAccount} has to be replaced with your Windows user account name; if your account name is DataScience, YourAccount should be replaced by DataScience in the path.
  4. The image below shows the 3 files that I downloaded and replaced in my platform-tools folder.
  5. Restart Android Studio and the error should be fixed.
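If you want a quick way to confirm that steps 2 and 3 worked, the small Python check below (my own addition, not part of the original fix) verifies that adb is reachable on the PATH and that it now starts without the 0xc0000135 error.

    # Quick sanity check after replacing the platform-tools files.
    import shutil
    import subprocess

    adb_path = shutil.which("adb")
    if adb_path is None:
        print("adb was not found on PATH - re-check step 2.")
    else:
        result = subprocess.run([adb_path, "version"], capture_output=True, text=True)
        if result.returncode == 0:
            print("adb is working:\n" + result.stdout)
        else:
            print("adb still fails to start - re-check the replaced files (step 3).")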

Tuesday, November 5

Digital Medicine - Future of Research & Medical Care

The modern practice of scientific medicine depends on written and printed information to store medical knowledge. New digital tools don't just record clinical data; they can also generate medical intelligence by analyzing historical data. This leap of the industry into "digital medicine" is potentially more precise, more effective, more widely distributed & available to more people than current medical practice. Critical steps in the creation of digital medicine are analysis of the impact of new technologies & coordinated efforts to direct technological development towards creating a new paradigm of medical care. So digital technology can be used in two areas of medicine: to aid research and to deliver medical care.


3D modelling is used to produce precise representations of patients' anatomies. This enables medical teams to plan and visualize complex surgeries & to produce life-saving implants and prostheses customized for individual patients. It's a remarkable evolution that is having a tremendous impact on patients' lives. However, this is only the tip of the iceberg. The true revolution in medicine and medical care in the 21st century will not come from such physical models, but from virtual ones. Looking into the future, these virtual models will be able to simulate the true physiology and pathophysiology of human beings, changing forever the way we research, diagnose and treat injuries and disease.
While your virtual twin may seem like a distant dream, progress in bringing this dream to life is actually already well underway in the nascent field of bio-intelligence. Bio-intelligence uses computer technologies to model, simulate, visualize and experience biological and medical processes in a virtual environment. While drug makers have for some time modeled and screened virtual proteins and compounds against medical databases, drug development and production remain largely rooted in the real world, and collaboration between disciplines and organizations has been limited.
Every day, drug makers work to produce real drugs that they test on real animals, and then on real patients in real clinical trials. The time and money they expend is staggering: according to studies, companies can expect to spend $3 billion over a period of ten years to bring a single new drug to market.
Add to this challenge the dynamism and complexity of living systems, and it becomes clear that a collaborative approach to research and development, along with the use of virtual modelling and simulation, could bring enormous benefits to life science and healthcare industries. Collaboration between scientific disciplines and between pharmaceutical companies, research labs, health service providers and computer companies would allow sharing of knowledge and experience to foster insight and innovation.
And, the collaborative use of computer models and simulation would enable researchers to better understand complex systems and more accurately predict the biological effects of various medicines and treatments, enabling drug makers in turn to fine tune real-world assays and eliminate ineffective treatments from trials before the drugs are even produced.
The changing landscape of research today is forcing the bioinformatics community to seek a new level of data sharing and collaboration only made possible with new platforms. Such approaches could also open the door to truly personalized healthcare as collaboratively produced models and simulations are combined with real-world data from individual patients. These changes could produce significant innovation and gains in efficiency, effectiveness and safety, bringing better health treatment outcomes to everyone.


Saturday, October 12

Smarter BPM using Blockchain concept



Blockchain-based distributed ledgers have been used to enable collaboration in a number of environments, ranging from diamond trading to securities settlement. The system's ability to execute defined scripts in the form of smart contracts, along with blockchain distributed ledger technology (DLT), makes it capable of managing inter-organizational processes. Blockchain platforms that support both DLT and smart contracts should be capable of hosting not only business data but also the rules for managing that data. Smart contracts execute code directly on the blockchain network as a series of process steps, based on an algorithm programmed to the rules of the contract and the blockchain.



Multi-party Collaboration
Smart contracts can be used to implement business collaborations both within and external to the organization. A blockchain-based real estate registry would allow banks, government agencies, buyers, and sellers to collaborate and track the progress of a process in real time. Specific aspects of inter-organizational business processes can be compiled into rules-based smart contracts to ensure that processes are correctly executed. Smart contracts can independently monitor processes, so that only valid messages are accepted and only from registered process participants. Security and accountability can be factored into the contract, as well as compliance with government regulations and internal rules and processes.
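The rule-checking behaviour described above can be illustrated with a small Python sketch. This is only a conceptual stand-in for a real smart contract, which would run on the blockchain platform itself; the participant names and process steps are invented for the example.

    # Conceptual sketch of the kind of checks a smart contract could enforce
    # for an inter-organizational process (not real blockchain code).
    REGISTERED_PARTICIPANTS = {"bank", "land_registry", "buyer", "seller"}
    ALLOWED_TRANSITIONS = {
        "offer_made": "offer_accepted",
        "offer_accepted": "payment_received",
        "payment_received": "title_transferred",
    }

    def accept_message(sender: str, current_step: str, proposed_step: str) -> bool:
        # Only registered participants may advance the process.
        if sender not in REGISTERED_PARTICIPANTS:
            return False
        # Only valid, rule-defined transitions are accepted.
        return ALLOWED_TRANSITIONS.get(current_step) == proposed_step

    print(accept_message("bank", "offer_accepted", "payment_received"))    # True
    print(accept_message("unknown_party", "offer_made", "offer_accepted")) # False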

Blockchain and smart Business Process Management
Even though smart contracts are self-executing, they can play a role in business process improvement. For example, in the case of supply chains, information from blockchain-based tracking of goods and materials can be used to develop algorithms that would prevent counterfeit products or lower-quality materials from entering the chain. By combining process information gathered by the smart contract with process visualization and lean and Six Sigma techniques, improvements can be made to the rules governing smart contracts.

Sunday, September 8

Digital India cannot be achieved without Health Insurance Portability and Accountability Act

Let me begin by reiterating the subject line - Digital India cannot be achieved without a Health Insurance Portability and Accountability Act. America revolutionized its healthcare with computers, and when it noticed there was a need for a law to ensure compliance it passed the Health Insurance Portability and Accountability Act, which also defines the requirements of Digital America and digital healthcare for America. If the Indian government wants Ayushman Bharat (which is similar to Obamacare in the USA) to succeed, it cannot be achieved without 2 important foundations:
1) A data protection law to protect the healthcare and private data of every individual
2) A healthcare accountability law that mandates certain standards of healthcare in every hospital

For an ordinary person, 'going digital' primarily means storing information in digital format. For a government, 'going digital' also means guaranteeing the protection of privacy for its citizens and allowing the use of healthcare data in such a manner that the data is secure, restricted to authorized entities, kept private, and made available to authorized entities over a secure internet connection with minimal effort.

When you go to a hospital for a medical test, the test reports and your personal data are stored on some hospital computer system. The hospital gives you a print of your report and maintains your medical records for an undisclosed period of time, which could be indefinite.

When you go to a 2nd hospital for a 2nd opinion, you have to share your paper reports with the doctor, because in more than 99% of hospitals in #India the 1st hospital does not give you access to your reports over the internet. The 2nd hospital may ask you to do another round of tests and again give you your reports in paper format.

After a few years a person has hundreds of pages of paper reports, and the report format varies from hospital to hospital because India does not mandate that hospitals use a standard format for medical records - a major failure of the Indian Medical Association, the Government of India and the other bodies responsible for implementing standards in healthcare.

The US government signed the Health Insurance Portability and Accountability Act of 1996. The HIPAA Privacy Rule is composed of national regulations for the use and disclosure of Protected Health Information (PHI) in healthcare treatment, payment and operations by covered entities. HIPAA was created primarily to
  1. modernize the flow of healthcare information, 
  2. stipulate how Personally Identifiable Information maintained by the healthcare and healthcare insurance industries should be protected from fraud and theft, 
  3. and address limitations on healthcare insurance coverage.


HIPAA was created to "improve the portability and accountability of health insurance coverage" for employees between jobs and to combat waste, fraud and abuse in health insurance and healthcare delivery. The act also contained passages to promote the use of medical savings accounts by introducing tax breaks, provides coverage for employees with pre-existing medical conditions and simplifies the administration of health insurance. The procedures for simplifying the administration of health insurance became a vehicle to encourage the healthcare industry to computerize patients' medical records. This particular part of the Act spawned the Health Information Technology for Economic and Clinical Health Act (HITECH) in 2009, which in turn led to the introduction of the Meaningful Use incentive program - described by leaders in the healthcare industry as "the most important piece of healthcare legislation to be passed in the last 20 to 30 years".


https://www.hipaajournal.com/hipaa-history/

https://en.wikipedia.org/wiki/Health_Insurance_Portability_and_Accountability_Act

Friday, August 16

How AI in Healthcare is performing diagnosis and saving lives at NHS

A doctor can use Optical Coherence Tomography (OCT) scanners to scan an eye and detect eye diseases. OCT scanners create around 65 million data points each time they are used, mapping each layer of the retina, and that's a lot of data for a doctor to study. DeepMind's AI claims to recognise 50 common eye problems from the OCT data, which means a doctor does not have to spend time analyzing the data. The results of the AI have been promising in trials: the algorithms were correct 94.5 per cent of the time, which is on par with retina specialists who were using extra notes along with the OCT scans.
DeepMind & Google joined forces in 2014 to accelerate AI research in healthcare and built a medical assistant application for the National Health Service (NHS). The significant AI work done by DeepMind - diagnosing eye diseases as effectively as the world's top doctors, saving 30% of the energy used to keep data centers cool & predicting the complex 3D shapes of proteins - is disruptive in the field of Artificial General Intelligence (AGI).
The application, called Streams, is a mobile phone app that aims to provide timely diagnoses using AI so that the right nurse or doctor gets to the right patient in time and saves the life of a patient who might otherwise have died. Each year, many thousands of patients in UK hospitals die from conditions like sepsis and acute kidney injury (AKI) because the warning signs aren't picked up and acted on in time.

Streams mobile medical assistant for clinicians has been in use at the Royal Free London NHS Foundation Trust since early 2017. The app uses the existing national AKI algorithm to flag patient deterioration, supports the review of medical information at the bedside, and enables instant communication between clinical teams. Shortly after rolling out at the Royal Free, clinicians said that Streams was saving them up to two hours a day. We also heard about patients whose treatment was escalated thanks to the timely alert by the app. Statistics show that the app saved clinicians time, improved care and reduced the number of AKI cases being missed at the hospital.


The above figure shows how the automated process in the medical app saves time and connects the doctor directly to the patient with a serious condition.

There has been controversy around Google taking over NHS data after DeepMind's health operation was moved under Google. DeepMind, which is owned by Google, used to operate the NHS app independently. DeepMind justified the decision by explaining how Google would allow the app to scale in a way that would not be possible on its own. In 2017, the Streams app attracted controversy after the UK's data watchdog found that the NHS had illegally handed 1.6 million patient records to DeepMind as part of a trial. DeepMind subsequently gave assurances that the medical data "will never be linked or associated with Google accounts, products or services" and that all patient data will remain under the strict control of its NHS partners. As long as DeepMind does not share or link patient data with Google, it will be a major achievement for the NHS in providing smarter health monitoring for AKI and many more diseases.

Link to NHS Website-  link

Wednesday, August 7

Arnold Schwarzenegger motivational speech - Do you have a vision ?

I came across this motivational speech by Arnold Schwarzenegger. It is as relevant to people as it is to software. Unless you have a dream and a vision of where you want to go, you may not meet your goal. Most of the time the vision is like a dream that sounds too good to be true and too difficult to realize, but you have to recognize that it is your dream. There is something in you that knows you have it in you to make that dream become reality.



A long time back, when I was in college, I came across a book in my father's library. I read this book by Dr. Robert Schuller titled 'Success Is Never Ending, Failure Is Never Final', and in it he gives real-life examples of the many dreams that he realized with his power of positive thinking. When he started with a dream he did not know how to realize it, he did not have a plan, and the dream looked impossible. After dreaming the same dream in sleep and while awake, his mind slowly started forming a vision of the possible ways to realize the dream. It was a slow process that took a few days, and it is important to believe in yourself and not give up on your dream. Those who are mentally strong continue to spend a reasonable amount of time nurturing the dream. Dr. Schuller says it is here that your motivation is tested: if you are not passionate about your dream, you give up on it. All the successful people we know have this one quality - they did not give up on their dream, and even after minor failures they re-evaluated the dream, renewed their faith in it and started again.
The human mind has this fantastic capability of processing information even when you are not awake, and there are tons of material, research papers and books written about this subject. Often you will realize that when you have a problem and can't find a solution, a few days later you think of a great idea to solve it. I am not a scientist and I have never done any research on the subject I am writing about, but I am passionate about these theories, and from personal experience I believe that when you are honest about solving some problem, the mind does some processing in its spare time and one fine day hands you the solution. You may have had this experience and wondered why you didn't think of it sooner, but what you should realize is that you, your mind or your subconscious mind - whatever you like to call it - was aware of the problem, was processing it all the time, and finally came out with a solution; this is no coincidence. What I want to say is that your dream, your vision, your plan, your mind and your subconscious mind are all connected, and when you are motivated they work together to realize your plans.

So why is a software developer / software architect talking about vision? Because when you build software you follow the same technique that you follow to plan your life.
  • You want to solve a real life problem or a business problem
  • You are able to visualize a software that will solve the problem - in your mind you see the solution
  • You can convince people why and how the software is relevant and sell them the idea
  • You know what the risks are and how you will mitigate them
  • You have a vision of how this software will be designed and what technologies will be used to build it 
  • You then create the roadmap for implementing the software
  • You make a plan to implement the prototype 
  • Once the prototype is successful you create a plan to build the software in stages
  • You monitor the software development so that things go as per the plan
If you miss any of these steps you may end up with an end product that is not perfect. Like #Arnold said, at the beginning you should have a vision, hunger and belief. Vision is something that is built on your knowledge. After a year you acquire more knowledge and experience, and you may realize that your vision needs some changes, which is perfectly OK. Your vision is the outcome of careful deliberation and thought and it should not change every day, but a vision can always improve when you have new insights.

   


                                              



https://www.youtube.com/watch?v=eWJVvNptHZ4

Friday, July 12

Wealth in the Historical Health Care and Clinical Data

One of the most important and valuable pieces of data about an individual is his or her personal health record data. Analysis of your historical personal health records can help the doctor do a much better analysis of your health and even predict diseases by looking at your historical medical test records.

Value of personal medical records

Assuming we have complete medical records of an individual since her/his birth, how can an individual benefit?
  1. No need to print paper reports; data in digital form is accessible over the internet
  2. A complete record of every illness, the treatment and the medical reports that show how the person responded to various treatment drugs, available in digital format
  3. Graphical representation of the data can give insights about various medical parameters
  4. The health data, in association with other related data like food habits, exercise routines & environment data, can give new insights into a person's health
  5. Historical health data can help a healthy person get better & cheaper insurance coverage
  6. Software can be developed to monitor your health in real time & give smart predictions even before you notice any visible symptoms of a health issue

Early detection & treatment of many critical illnesses like cancer, heart disease etc. can give you and your doctor a head start in tackling the disease early and improve the chances of a cure. The medical tests that you have done all your life are important personal data. If the data is stored in digital format and analyzed by software, it can give key insights to you & your doctor. Unfortunately, we are not in the habit of maintaining our personal health record data over time, the government does not have any laws to enforce digitization of medical records, and there are no standards for data privacy and data management for the healthcare industry in India.
Our healthcare systems are driven by commercial considerations, and the healthcare industry does not have adequate systems to maintain personal medical history from birth onwards. Even today most hospitals only give printed reports and do not facilitate digital storage of your medical records. This means your data perishes with your paper reports and cannot be analyzed for insights from your medical history.

Importance of analyzing personal health records in disease detection

I am speaking from my own experience of how my family faced a challenge in diagnosing the illness of a family member: we were not able to identify some obvious deviations in CBC (Complete Blood Count) report parameters because we did not have the medical records in digital format. We kept visiting a general practitioner with paper reports, and the doctor possibly did not analyze the change in certain parameters over the months. Only when we visited a specialist did we come to understand the medical data that helped us diagnose the critical illness. To give an example, when a person has blood cancer (CML), CBC report parameters like WBC count, hemoglobin level & platelet count change drastically. If you compare a patient's CBC reports, you can see which of the parameters have a variance of more than 10% (the actual variation in parameters in the case of CML can be more than 100%), and that information can help the doctor do further investigations to detect blood cancer.
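To make the comparison concrete, here is a small Python sketch that flags CBC parameters whose value changes by more than a chosen threshold between two reports. The parameter names and sample numbers are illustrative, not real patient data, and the 10% threshold is simply the figure mentioned above.

    # Flag CBC parameters that changed by more than a threshold between two reports.
    # Parameter names and values are illustrative, not real patient data.
    def flag_deviations(previous: dict, current: dict, threshold: float = 0.10) -> dict:
        flagged = {}
        for parameter, old_value in previous.items():
            new_value = current.get(parameter)
            if new_value is None or old_value == 0:
                continue
            change = (new_value - old_value) / old_value
            if abs(change) > threshold:
                flagged[parameter] = round(change * 100, 1)  # percent change
        return flagged

    previous_report = {"wbc_count": 7000, "hemoglobin": 13.5, "platelets": 250000}
    current_report  = {"wbc_count": 58000, "hemoglobin": 10.1, "platelets": 610000}

    print(flag_deviations(previous_report, current_report))
    # {'wbc_count': 728.6, 'hemoglobin': -25.2, 'platelets': 144.0}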


Digital Data, Visualization Software & Data Analysis

As a software engineer, when I reviewed the situation I realized that if we had had the medical records in digital format, some basic knowledge of the medical parameters and software to visualize the records, it would have been quite easy to identify the parameters that were deviating from the normal acceptable range, and it would have been much easier for the doctor to diagnose the disease. For a patient with a chronic illness it is critical to keep a watch over changing health parameters even before you share the data with your doctor, and a simple mobile application can help the patient monitor his/her health. So I created a CBC Monitor and distributed the application free of cost to a few cancer hospitals, who have given the software to their patients for maintaining and sharing their CBC records - a test that has to be conducted every month by a CML (Chronic Myeloid Leukaemia, also called blood cancer) patient.

Sample mobile application screen displaying CML Report in graphical format


There are a few key points to note while building software for analyzing a person's health records:
  1. There are standard acceptable ranges of values for each medical parameter of the human body
  2. Every person can have unique values for the standard medical parameters that may be lower or higher than the ideal range recommended by medical standards
  3. Every illness will cause some deviation in personal medical parameters over a period, and this information can serve as a rule for the software
  4. The software's user interface design has to be user friendly

So software that is built considering the above points can be customized to monitor the health, and the change in health, of each individual. With the growing penetration of mobile devices, it is only logical that they are the best bet for personal health monitoring software that will empower each individual to save, analyze & share his or her medical data on the go. Unfortunately, neither the hospitals nor the government health services in #India have focused on leveraging mobile devices to empower the patient to digitize his/her medical records. I would have expected private companies in healthcare to leverage this opportunity to provide free software and storage to the public to maintain their medical records, and also to use the software to build customer loyalty towards their brand. What is clearly lacking is a long-term vision and the intent to provide better health management for people.
         
What is required today from IT service providers is to bridge the gap between people, hospitals and insurance companies by building software that can 'Save, Analyze, Share' healthcare data for the benefit of all the parties.
  1. People require free-to-use software for maintaining their medical records
  2. Doctors, hospitals, labs & insurance companies require software that improves customer loyalty & customer retention.
  3. The healthcare industry in India needs software applications that help interpret clinical and health data in a better way, by digitizing the data and using sophisticated algorithms to predict illnesses or detect them in their early stages.
  4. To initiate digital healthcare, the Indian government needs to pass laws to ensure hospitals are responsible for storing patient health records and also for ensuring portability of the personal record data.
  5. The government also needs to make sure that the medical governing bodies of healthcare define standards for healthcare data to enable healthcare data standardization.
  6. Finally, the government needs to define a data privacy law for the healthcare industry and create a monitoring body to ensure the laws are implemented by the industry.
America implemented HIPAA in 1996 and passed a law to ensure compliance by the healthcare industry. The law has ensured that hospitals are accountable for maintaining every American's medical records in a standard, portable digital format. India is already 23 years late in initiating the standardization and implementation of digital healthcare, and we cannot delay it any further. As for the healthcare industry, the companies that provide such software services will build a new segment & set the pace for the next medical revolution in #India. We know data is valuable, and historical healthcare data is even more valuable. The reason I think this data will revolutionize healthcare in India and globally is that India is the world's 2nd most populous country, and we should consider this historical healthcare data as the 'result of voluntary drug trials', because the data gives insight into how patients respond to various drugs. So will the politicians take the advice and implement an Indian HIPAA law?





Wednesday, June 26

Microservices Architecture - Not quite Service Oriented Architecture in a new bottle!

Those who are familiar with SOA have often told me, 'Microservices architecture is actually old wine in a new bottle! The industry wanted new hype, and so they came out with the concept of microservices architecture.' Well, not quite so: microservices architecture is a subset of service-oriented architecture. In fact, we can even call microservices a SOA design pattern.

Let's take a look at the key similarities and differences between Microservices Architecture & SOA.

The concept of a service is common to both architectures. In both, a service has a certain responsibility, and services can be developed in various technology stacks, which brings technology diversity into the architecture. In SOA, the development of services can be organized across multiple teams; however, each team needs to know about the common communication architecture in SOA. In a microservices architecture, services can operate and be deployed independently of the other services, unlike in SOA. So it is easier to deploy new versions of services frequently or to scale a service independently.

                                                                                                                                                            


One hypothetical example of an SOA drawback (honestly, this rarely happens, but we are building a case for microservices) is that since every service in SOA communicates through the ESB, if one of the services slows down, it could cause the ESB to be clogged up with requests for that service. A microservices architecture, on the other hand, is not designed around an ESB and so has better fault tolerance. For example, if there is a memory leak in one microservice, then only that microservice will be affected and the other microservices will not be.



In both architectures, developers must deal with the complexity of the architecture and of a distributed system. Developers must implement the inter-service communication mechanism, whether between microservices (if a message queue is used in the microservices architecture) or between the ESB and its services. In SOA, services share the data storage, while in microservices each service can have its own independent data storage. Sharing data storage has its pros and cons: for example, the data can be re-used by all services, but it brings dependency and tight coupling between services.

The main difference between SOA and microservices lies in size and scope. A microservice has a significantly smaller scope than SOA and is mainly one of a set of small(er), independently deployable services. An SOA, on the other hand, can be either a monolith or be comprised of multiple microservices.

A service-oriented architecture is a software architecture pattern that promotes reusability of services. The application components provide services to other components via a communications protocol over a network. The communication can involve either a simple service call or two or more services coordinating with each other. Microservices, on the other hand, is a software architecture pattern in which complex applications are composed of small, independent processes communicating with each other using language-agnostic APIs. Microservices should be independently deployable, and it should be possible to shut down a service when it is not required in the system without any impact on the other services.
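As a small illustration of the language-agnostic API idea, here is a hedged Python sketch of one microservice calling another over HTTP/JSON. The service URL, endpoint and response fields are assumptions for the example; any language that can speak HTTP and JSON could make the same call, since only the JSON contract is shared, not code or a database.

    # Sketch: one microservice calling another over a language-agnostic HTTP/JSON API.
    # The URL and response fields are assumptions for illustration.
    import json
    import urllib.request

    def fetch_price(sku: str, base_url: str = "http://pricing-service:8080") -> float:
        # Only the JSON contract couples the two services.
        with urllib.request.urlopen(f"{base_url}/price?sku={sku}") as response:
            payload = json.loads(response.read().decode())
        return payload["price"]

    if __name__ == "__main__":
        print(fetch_price("demo-sku"))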

Wednesday, June 19

Where is your eCommerce data saved? How is your data being used? Is your data protected by data privacy laws?

A few questions every consumer should ask

  1. Where does the e-commerce transaction data & consumers personal data get stored? Does it get stored in India, in USA or some country that provides cheaper storage? How about China or Pakistan? 
  2. How is your data being used by the service providers like Mastercard, Visa, Amazon, PayTM? Are these companies sharing insights from your data with other companies?
  3. Can the Indian government monitor data usage if the data is stored in another country? Can the government enforce its data privacy law when eCommerce companies store data in another country?
  4. What happens to your Personal Data in case of hostile situation with the country in which your data is stored?



The Big B of Data - Indian E-commerce Market 2019


The 3 pertinent questions every eCommerce consumer and social media user should ask -

  1. Where does the e-commerce transaction data & consumers personal data get stored? Does it get stored in India, in USA or some country that provides cheaper storage? How about China or Pakistan? 
  2. Can Indian governments keep a watch over Data Usage if it is stored in another country ? 
  3. Can the government enforce its data privacy law when eCommerce companies store data in another country?

Sunday, June 9

Why does India need a Data Localization Law?

The internet revolution triggered the data avalanche and led to innovations in data processing and data analytics technologies. For the last 10 years we have been creating a totally new type of data that I like to call event data or activity data. For every activity we are creating 'data'; for example, when you move from home to the office, your mobile device is generating tons of GPS data even when you are not using the device. Data scientists can analyze this GPS data & extract valuable insights that can be monetized. Today it can be said that 'data is much more valuable than money!' If you want to know why data is more valuable than money in today's world, then read on.

What is personal data? What is data privacy? What is data localization? Why should you be aware?

When you swipe your credit card or use Flipkart or Amazon, have you ever wondered how and where your financial transaction and personal information is stored? Well, most of the information (also called data) is usually partly or completely stored in a database outside India.

For those of you who are not software experts, data is nothing but information like
1) your personal details like date of birth, name, address, phone number, email
2) your net-banking/credit/debit card transaction details,
3) your travel details like ticket history, hotel reservations, cab payments
4) your GPS details on the date of every digital transaction
5) and many more related details like your bank name, gas company, insurance provider etc.
stored in some computer by the eBays, Amazons, PayTMs, Visas and Mastercards of this world. This data is stored forever because it is valuable to some company. Over time you may forget some events, but your data, which is also called your digital footprint, will always be stored on someone's computer.

Apart from worries about who has access to your data if it is stored on an overseas computer (or cloud, which is nothing but a bunch of virtual computers), the Indian government and regulators have limited access to this data across borders. The RBI wants to change this through its data localization rules and ensure data is stored within Indian geography, so that Indian consumers and the government have sovereignty over their data. Data localization is the act of storing data on a device physically present within the borders of a country. As of now, most global as well as Indian companies store the data on a cloud outside India.
The Reserve Bank of India's localization rule mandates that companies collecting critical data about Indian consumers must store and process it within the borders of the country. The RBI had issued a circular mandating that payments-related data collected by payment providers must be stored only in India, setting an October 15, 2018 deadline for compliance. This covered not only card payment services by Visa and MasterCard but also companies such as Paytm, WhatsApp and Google, which offer electronic or digital payment services.

Why worry about data now?

From the time we started using credit cards, debit cards and internet banking, the information about our transactions has been stored by card companies, commerce companies & banks in databases and used for 'improving business processes and better understanding the customer'. There have been instances when some of these companies were found to be misusing 'consumer information' without the consent or knowledge of the customer. In all probability, if the customer were made aware, he would not approve of the way companies use his historical transaction information, which is protected by privacy laws. This triggered the move by governments of many countries, like the USA, to draft data privacy laws to protect their citizens and their electronic data.
In one of my older posts I mentioned how your so-called harmless credit card data can be used by marketing companies to predict events in your life and push advertisements to you. One supermarket chain was once rumored to have predicted the pregnancy of its women consumers based on their digital footprint, without even using social media data (think about that when you post personal opinions on social media). The marketing company studied the shopping pattern of one family, its software predicted that the woman was pregnant, and it started pushing baby product ads to the family, to the family's shock. (You can read about big data in the previous blog post 'The one who leverages Big Data will win the elections!')

It's not just your credit-card and online purchase data that can be misused!

  • People using the internet and internet-connected devices are creating data 24/7.
  • You are creating data when you surf, shop and travel, and even when you are doing nothing in particular.
  • This data gives multidimensional insights and is invaluable to companies as well as governments.
  • Analysis of the data can tell a lot about an individual's behavior, choices and habits.
  • New-age software can learn from historical data and predict behavior and events.
  • Very few people are aware of data privacy.
  • Not all data about a person or entity is public data. By law it is illegal to use any data in a way that breaches a person's privacy.
The main intent behind data localization is to protect the personal and financial information of the country's citizens and residents from foreign surveillance, and to give local governments and regulators the jurisdiction to call for the data when required. This aspect has gained importance after revelations that social media giant Facebook shared user data with Cambridge Analytica, which is alleged to have influenced voting outcomes, leading to a global clamor by governments for data localization.

Why is data localization important to a country?

Data localization requires that data created within certain borders stay within them. Data localization is essential to national security because storing data locally is expected to help law-enforcement agencies access information that is needed for the detection of a crime or to gather evidence. Where data is not localized, the agencies need to rely on Mutual Legal Assistance Treaties (MLATs) to obtain access, delaying investigations. On-shoring global data could also create domestic jobs and skills in data storage and analytics, as the Srikrishna report pointed out. However, maintaining multiple local data centers may entail significant investments in infrastructure and higher costs for global companies, which is why they seem to be up in arms against these rules. At the same time, I must say that global companies need to respect every country's right to data sovereignty and understand the risk faced by countries when data is stored in another country. If the consumer data of 1.3 billion Indians falls into the wrong hands, it can be used to inflict huge damage on the country.

How is Data sovereignty different from Data Localization?

Data localization requires that data created within certain borders stay within them, while data sovereignty means not only that the data is stored in a designated location but also that it is subject to the laws of the country in which it is physically stored. Data sovereignty ensures that stored data is subject to the legal protections and punishments of that country, so data localization is essential for ensuring data sovereignty. Russia's On Personal Data Law (OPD-Law) requires the storage, update and retrieval of data of its citizens to be limited to data centers within the Russian Federation.

All of us trust service providers with personal information, both on a voluntary and an involuntary basis, and we should expect greater accountability from these firms about the end use of this data. Data localization will ensure that domestic law enforcement can respond more effectively to our complaints.

Understanding Generative AI and Generative AI Platform leaders

We are hearing a lot about power of Generative AI. Generative AI is a vertical of AI that  holds the power to #Create content, artwork, code...