Friday, December 6

Walking In The Cloud


I have been discussing Cloud Strategy with a group of architects, and once again it became evident that most people still seem to think Cloud is just another product that you can adopt without any real need for an enterprise-wide Cloud Strategy. It is not as simple as selecting one cloud service provider from Amazon Web Services (AWS), Google, Microsoft Azure, Salesforce, and IBM. Enterprises should engage a Cloud Architect, or else take help from the cloud service provider, to define their Cloud Strategy. For example, if you are considering AWS, an AWS architect will provide architecture guidance to the enterprise to define its cloud strategy. Each vendor has multiple product offerings for its cloud, and it is important to understand how the cloud service provider will replace your hardware infrastructure and how it will provide a robust solution.

Key points to consider when deciding the cloud strategy
  • How will data be stored in the cloud, and how will you save on data center expenses?
  • How will the cloud enable applications to scale on demand?
  • How is cloud capacity planning different from capacity planning for owned infrastructure?
  • How will the enterprise convert capital expenses into pay-as-you-use variable expenses? (A simple cost sketch follows this list.)
  • How will the cloud increase speed of delivery through APIs and reusable services?
  • How will the business become more agile by moving to a cloud-based architecture?
  • What additional features of each cloud service provider differentiate it from the others?
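As a rough, hypothetical illustration of the capex versus pay-as-you-use point above, the Python sketch below compares the cumulative cost of buying servers up front against paying only for the hours actually used. All prices, server counts and utilization figures are made-up assumptions, not vendor pricing.

```python
# Hypothetical capex vs. pay-as-you-use comparison; all numbers are illustrative assumptions.

def owned_infra_cost(months, upfront_per_server=12_000, servers=10, monthly_ops_per_server=150):
    """Cumulative cost of buying servers up front plus ongoing operations."""
    return upfront_per_server * servers + monthly_ops_per_server * servers * months

def cloud_cost(months, hourly_rate=0.20, hours_per_month=730, avg_utilization=0.4, servers=10):
    """Cumulative pay-as-you-use cost: you only pay for the hours you actually run."""
    return hourly_rate * hours_per_month * avg_utilization * servers * months

for months in (12, 24, 36):
    print(f"{months:>2} months: owned = {owned_infra_cost(months):>9,.0f}, "
          f"cloud = {cloud_cost(months):>9,.0f}")
```

Plugging in real vendor prices and your own utilization profile is what turns this toy comparison into an actual business case.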

Many enterprises still lack clarity about their cloud strategy because they are not familiar with the cloud offerings. An enterprise that wants to adopt the cloud across all its business units must have a mature and well-formed understanding of its Enterprise Architecture and a clear view of its components. Enterprise Architecture Planning enables an enterprise to build the structural foundations to support proposed business strategies. It captures the vision of an enterprise by integrating its dimensions to contextualize transformation strategies, organizational structures, business capabilities, data pools, IT applications, and all technology objects. Every business unit of an enterprise is subject to change, and each change may have significant consequences throughout organizational domains.

Cloud Computing is a paradigm that decentralizes data centers by virtualizing both infrastructure and platform, and enables services over the internet. It gives access to platforms, services, and tools from browsers deployed across millions of terminals. It also reduces the management and maintenance of all the resources associated with technology and infrastructure, while providing dynamism, independence, portability, usability, and scalability of platform tools.
 

Sunday, November 10

Android Error Solution / Solved : Unable to detect adb version, exit value 0xc0000135

I have a Windows 10 license, but I prefer using Windows 8.1 on my development laptop. Recently I refreshed my Windows 8.1 installation and reinstalled Android Studio 3.5.2, and I started getting the following error as soon as I opened Android Studio.

The error happens because the fresh Windows 8.1 install, even with all required updates, is missing the 'Windows Universal C Runtime', which is normally installed as part of Windows Update; reinstalling Windows removed it. As mentioned in the error text, Android Studio's recommended solution is to download the C runtime package from the Microsoft support website (the URL is in the image below), or else you can download the 3 ADB-specific files from the internet and add them to the platform-tools folder on your Windows machine.

Android Studio Error : Unable to detect adb version, exit value 0xc0000135





 


Solution:

  1. Before you begin making changes, back up the platform-tools folder on your machine.
  2. Add the platform-tools directory path to the system PATH environment variable.
  3. Replace the following 3 files in your platform-tools directory at the path
    (C:\Users\{YourAccount}\AppData\Local\Android\Sdk\platform-tools). {YourAccount} has to be replaced with your Windows user account name; if your account name is DataScience, replace {YourAccount} with DataScience in the path.
  4. The image below shows the 3 files that I downloaded and replaced in my platform-tools folder.
  5. Restart Android Studio and the error should be gone. (A quick verification sketch follows below.)
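As a quick sanity check after replacing the files (assuming platform-tools is now on your PATH), a minimal Python sketch like the one below simply runs adb and prints the reported version:

```python
# Minimal check that adb runs correctly after the fix (assumes platform-tools is on PATH).
import subprocess

try:
    result = subprocess.run(["adb", "version"], capture_output=True, text=True, check=True)
    print(result.stdout.strip())  # prints the Android Debug Bridge version string
except (FileNotFoundError, subprocess.CalledProcessError) as exc:
    print(f"adb is still not working: {exc}")
```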

Tuesday, November 5

Digital Medicine - Future of Research & Medical Care

The modern practice of scientific medicine depends on written and printed records to store medical information. New digital tools do not just record clinical data; they can also generate medical intelligence by analyzing historical data. This leap of the industry into "digital medicine" is potentially more precise, more effective, more widely distributed and available to more people than current medical practice. Critical steps in the creation of digital medicine are analysis of the impact of new technologies and coordinated efforts to direct technological development towards creating a new paradigm of medical care. Digital technology can thus be used in two areas of medicine: to aid research and to deliver medical care.


3D modelling is used to produce precise representations of patients' anatomies. This enables medical teams to plan and visualize complex surgeries and to produce life-saving implants and prostheses customized for individual patients. It is a remarkable evolution that is having a tremendous impact on patients' lives. However, this is only the tip of the iceberg. The true revolution in medicine and medical care in the 21st century will not come from such physical models, but from virtual ones. Looking into the future, these virtual models will be able to simulate the true physiology and pathophysiology of human beings, changing forever the way we research, diagnose and treat injuries and disease.
While your virtual twin may seem like a distant dream, progress in bringing this dream to life is actually already well underway in the nascent field of Bio-Intelligence. Bio-Intelligence uses computer technologies to model, simulate, visualize and experience biological and medical processes in a virtual environment. While drug makers have for some time modeled and screened virtual proteins and compounds against medical databases, drug development and production remain largely rooted in the real world, and collaboration between disciplines and organizations has been limited.
Every day, drug makers work to produce real drugs that they test on real animals, and then on real patients in real clinical trials. And the time and money they expend is staggering. According to studies, companies can expect to spend $3 billion over a period of ten years to bring a single new drug to market.
Add to this challenge the dynamism and complexity of living systems, and it becomes clear that a collaborative approach to research and development, along with the use of virtual modelling and simulation, could bring enormous benefits to life science and healthcare industries. Collaboration between scientific disciplines and between pharmaceutical companies, research labs, health service providers and computer companies would allow sharing of knowledge and experience to foster insight and innovation.
And the collaborative use of computer models and simulation would enable researchers to better understand complex systems and more accurately predict the biological effects of various medicines and treatments, enabling drug makers in turn to fine-tune real-world assays and eliminate ineffective treatments from trials before the drugs are even produced.
The changing landscape of research today is forcing the bioinformatics community to seek a new level of data sharing and collaboration made possible only with new platforms. Such approaches could also open the door to truly personalized healthcare as collaboratively produced models and simulations are combined with real-world data from individual patients. These changes could produce significant innovation and gains in efficiency, effectiveness and safety, bringing better health treatment outcomes to everyone.


Saturday, October 12

Smarter BPM using Blockchain concept



Blockchain-based distributed ledgers have been used to enable collaboration in a number of environments, ranging from diamond trading to securities settlement. The system's ability to execute defined scripts in the form of smart contracts, together with blockchain Distributed Ledger Technology (DLT), makes it capable of managing inter-organizational processes. Blockchain platforms that support both DLT and smart contracts should be capable of hosting not only business data but also the rules for managing that data. Smart contracts execute code directly on the blockchain network as a series of process steps, based on an algorithm programmed to the rules of the contract and the blockchain.



Multi-party Collaboration
Smart contracts can be used to implement business collaborations both within and outside the organization. A blockchain-based real estate registry, for example, would allow banks, government agencies, buyers, and sellers to collaborate and track the progress of a process in real time. Specific aspects of inter-organizational business processes can be compiled into rule-based smart contracts to ensure that processes are correctly executed. Smart contracts can independently monitor processes, so that only valid messages are accepted, and only from registered process participants. Security and accountability can be factored into the contract, as well as compliance with government regulations and internal rules and processes.
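As a toy illustration of these ideas (not any particular blockchain platform), the Python sketch below models a shared process ledger in which only registered participants may append steps, and each step is hash-chained to the previous one so tampering is detectable. All names and rules are hypothetical.

```python
# Toy hash-chained process ledger with registered participants (illustration only).
import hashlib
import json

class ProcessLedger:
    def __init__(self, participants):
        self.participants = set(participants)  # only these parties may record steps
        self.chain = []                        # list of hash-linked process steps

    def record_step(self, sender, step, payload):
        if sender not in self.participants:
            raise PermissionError(f"{sender} is not a registered process participant")
        prev_hash = self.chain[-1]["hash"] if self.chain else "0" * 64
        entry = {"sender": sender, "step": step, "payload": payload, "prev_hash": prev_hash}
        entry["hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.chain.append(entry)
        return entry["hash"]

# Hypothetical real-estate registry collaboration between a bank, a registrar and a buyer.
ledger = ProcessLedger({"bank", "registrar", "buyer"})
ledger.record_step("buyer", "offer_submitted", {"property_id": "P-101", "amount": 250000})
ledger.record_step("bank", "loan_approved", {"property_id": "P-101"})
print(len(ledger.chain), "steps recorded")
```

A message from an unregistered party would be rejected, and any later edit to an earlier step would break the hash chain, which is the property the paragraph above relies on.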

Blockchain and smart Business Process Management
Even though smart contracts are self-executing, they can play a role in business process improvement. For example, in the case of supply chains, information from blockchain-based tracking of goods and materials can be used to develop algorithms that would prevent counterfeit products or lower-quality materials from entering the chain. By combining process information gathered by smart contracts with process visualization and lean and Six Sigma techniques, improvements can be made to the rules governing the smart contracts.

Sunday, September 8

Digital India cannot be achieved without Health Insurance Portability and Accountability Act

Let me begin by reiterating the subject line: Digital India cannot be achieved without a Health Insurance Portability and Accountability Act. America revolutionized its healthcare with computers, and when it noticed there was a need for a law to ensure compliance, it passed the Health Insurance Portability and Accountability Act, which also defines the requirements of Digital America and digital healthcare for America. If the Indian government wants Ayushmaan Bharat (which is similar to Obamacare in the USA) to succeed, it cannot be achieved without 2 important foundations:
1) A Data Protection Law to protect the healthcare and private data of every individual
2) A Healthcare Accountability Law that mandates certain standards of healthcare in every hospital

For the ordinary person, 'going digital' primarily means storing information in digital format. For a government, 'going digital' also means guaranteeing the privacy of its citizens and allowing healthcare data to be used in such a manner that the data is secure, restricted to authorized entities, kept private, and made available to authorized entities over a secure internet connection with minimal effort.

When you go to a hospital for a medical test, the test reports and your personal data are stored on some hospital computer system. The hospital gives you a printout of your report and maintains your medical records for an undisclosed period of time, which could be indefinite.

When you go to a 2nd hospital to take a 2nd opinion, you have to share your paper reports with the doctor, because in more than 99% of hospitals in #India the 1st hospital does not give you access to your reports over the internet. The 2nd hospital may ask you to do another round of tests and again give you your reports in paper format.

After a few years a person has hundreds of pages of paper reports, and the report format varies from hospital to hospital because India does not mandate a standard format for medical records - a major failure of the Indian Medical Association, the Government of India and the other bodies responsible for implementing standards in healthcare.

The US government signed the Health Insurance Portability and Accountability Act (HIPAA) in 1996. The HIPAA Privacy Rule is composed of national regulations for the use and disclosure of Protected Health Information (PHI) in healthcare treatment, payment and operations by covered entities. HIPAA was created primarily to
  1. modernize the flow of healthcare information, 
  2. stipulate how Personally Identifiable Information maintained by the healthcare and healthcare insurance industries should be protected from fraud and theft, 
  3. and address limitations on healthcare insurance coverage.


HIPAA was created to “improve the portability and accountability of health insurance coverage” for employees between jobs and to combat waste, fraud and abuse in health insurance and healthcare delivery. The act also contained passages to promote the use of medical savings accounts by introducing tax breaks, provide coverage for employees with pre-existing medical conditions and simplify the administration of health insurance. The procedures for simplifying the administration of health insurance became a vehicle to encourage the healthcare industry to computerize patients' medical records. This particular part of the Act spawned the Health Information Technology for Economic and Clinical Health Act (HITECH) in 2009, which in turn led to the introduction of the Meaningful Use incentive program, described by leaders in the healthcare industry as “the most important piece of healthcare legislation to be passed in the last 20 to 30 years”.


https://www.hipaajournal.com/hipaa-history/

https://en.wikipedia.org/wiki/Health_Insurance_Portability_and_Accountability_Act

Friday, August 16

How AI in Healthcare is performing diagnosis and saving lives at NHS

A doctor can use Optical Coherence Tomography (OCT) scanners to scan an eye and detect eye diseases. OCT scanners create around 65 million data points each time they are used, mapping each layer of the retina, and that is a lot of data for a doctor to study. DeepMind's AI claims to recognise 50 common eye problems from the OCT data, which means a doctor does not have to spend as much time analyzing the data. The results of the AI have been promising in trials, considering the algorithms were correct 94.5 per cent of the time, which is on par with retina specialists who were using extra notes along with the OCT scans.
DeepMind and Google joined forces in 2014 to accelerate AI research in healthcare and built a medical assistant application for the National Health Service (NHS). The significant AI work done by DeepMind in diagnosing eye diseases as effectively as the world's top doctors, in saving 30% of the energy used to keep data centers cool, and in predicting the complex 3D shapes of proteins is disruptive in the field of Artificial General Intelligence (AGI).
The application, called Streams, is a mobile phone app that aims to provide timely diagnoses using AI so that the right nurse or doctor gets to the right patient in time and saves the life of a patient who might otherwise have died. Each year, many thousands of patients in UK hospitals die from conditions like sepsis and acute kidney injury (AKI) because the warning signs aren't picked up and acted on in time.

Streams mobile medical assistant for clinicians has been in use at the Royal Free London NHS Foundation Trust since early 2017. The app uses the existing national AKI algorithm to flag patient deterioration, supports the review of medical information at the bedside, and enables instant communication between clinical teams. Shortly after rolling out at the Royal Free, clinicians said that Streams was saving them up to two hours a day. We also heard about patients whose treatment was escalated thanks to the timely alert by the app. Statistics show that the app saved clinicians time, improved care and reduced the number of AKI cases being missed at the hospital.


The above figure shows how the automated process in the medical app saves time and connects the doctor directly to the patient with a serious condition.

There has been controversy around Google taking over NHS data after it was announced that the Streams app would move under Google. DeepMind, which is owned by Google, had operated the NHS app independently until then. DeepMind justified the decision by explaining how Google would allow the app to scale in a way that would not be possible on its own. Earlier, in 2017, the Streams app attracted controversy after the UK's data watchdog found that the NHS had illegally handed 1.6 million patient records to DeepMind as part of a trial. DeepMind subsequently made assurances that the medical data “will never be linked or associated with Google accounts, products or services”, and that all patient data will remain under the strict control of its NHS partners. As long as DeepMind does not share or link patient data with Google, it will be a major achievement for the NHS in providing smarter health monitoring for AKI and many more conditions.

Link to NHS Website-  link

Wednesday, August 7

Arnold Schwarzenegger motivational speech - Do you have a vision ?

I came across this motivational speech by Arnold Schwarzenegger. It is relevant to people as well as to software. Unless you have a dream and a vision of where you want to go, you may not meet your goal. Most of the time the vision is like a dream that sounds too good to be true and too difficult to realize, but you have to remember that it is your dream; there is something in you that wants that dream to become reality.



A long time back, when I was in college, I came across a book in my father's library. The book, by Dr. Robert Schuller, was titled 'Success is Never Ending, Failure is Never Final', and in it he gives real-life examples of the many dreams that he realized with his power of positive thinking. When he started with a dream he did not know how to realize it, he did not have a plan, and the dream looked impossible. By dreaming the same dream in sleep and when awake, his mind could slowly start forming a vision of the possible ways to realize the dream. It was a slow process that took a few days, and it is important to believe in yourself and not give up on your dream. Those who are mentally strong continue to spend a reasonable amount of time nurturing the dream. Dr. Schuller says it is here that your motivation is tested: if you are not passionate about your dream, you give up on it. All the successful people we know share this one quality, that they did not give up on their dream; even after minor failures they re-evaluated the dream, renewed their faith in it and started again.
The human mind has this fantastic capability of processing information even when you are not awake, and there are tons of material, research papers and books written about this subject. Often you will find that when you have a problem and can't find a solution, a few days later you think of a great idea to solve it. I am not a scientist and I have never done any research on the subject I am writing about, but I am passionate about these theories, and from personal experience I believe that when you are honest about solving some problem, the mind does some processing in its spare time and one fine day hands you the solution. You may have had this experience and wondered why you didn't think of it sooner, but what you should realize is that you, your mind, or your subconscious mind (whatever you may like to call it) was aware of the problem, was processing it all the time, and finally came out with a solution; this is no coincidence. What I want to say is that your dream, your vision, your plan, your mind and your subconscious mind are all connected, and when you are motivated they work together to realize your plans.

So why is a software developer / software architect talking about vision? Well, because when you build software you follow the same technique that you follow to plan your life.
  • You want to solve a real life problem or a business problem
  • You are able to visualize a software that will solve the problem - in your mind you see the solution
  • You can convince people why and how the software is relevant and sell them the idea
  • You know what the risks are and how you will mitigate them
  • You have a vision of how this software will be designed and what technologies will be used to build it 
  • You then create the roadmap for implementing the software
  • You make a plan to implement the prototype 
  • Once the prototype is successful you create a plan to build the software in stages
  • You monitor the software development so that things go as per the plan
If you miss any of these steps you may end up with an end product that is not perfect. Like #Arnold said, at the beginning you should have a vision, hunger and belief. Vision is something that is built on your knowledge. After a year you acquire more knowledge and experience, and you may realize that your vision needs some changes, and that is perfectly okay. Your vision is the outcome of careful deliberation and thought and should not change every day, but a vision can always improve when you have new insights.

   


                                              



https://www.youtube.com/watch?v=eWJVvNptHZ4

Friday, July 12

Wealth in the Historical Health Care and Clinical Data

One of the most important and crucial pieces of data for an individual is his or her Personal Health Record. Analysis of your historical personal health records can help the doctor do a much better analysis of your health and even predict diseases by looking at your historical medical test records.

Value of personal medical records

Assuming we have complete medical records of an individual since her/his birth, how can an individual benefit?
  1. No need to print paper reports; data in digital form is accessible over the internet
  2. A complete record of every illness, the treatment and the medical reports that show how the person responded to various drugs, available in digital format
  3. Graphical representation of the data can give insights into various medical parameters
  4. The health data, in association with other related data like food habits, exercise routines and environmental data, can give new insights into a person's health
  5. Historical health data can help a healthy person get better and cheaper insurance coverage
  6. Software can be developed to monitor your health in real time and give smart predictions even before you notice any visible symptoms of a health issue

Early detection and treatment of critical illnesses like cancer and heart disease can give you and your doctor a head start in tackling the disease and improve the chances of a cure. The medical tests that you have done all your life are important personal data. If the data is stored in digital format and analyzed by software, it can give key insights to you and your doctor. Unfortunately, we are not in the habit of maintaining our personal health record data over time, the government does not have any laws to enforce digitization of medical records, and there are no standards for data privacy and data management for the healthcare industry in India.
Our healthcare systems are driven by commercial considerations, and the healthcare industry does not have adequate systems to maintain personal medical history from birth onwards. Even today most hospitals only give printed reports and do not facilitate digital storage of your medical records. This means your data perishes with your paper reports and cannot be analyzed for insights from your medical history.

Importance of analyzing personal health records in disease detection

I am speaking from my own experience of how my family faced a challenge in diagnosing the illness of a family member: we were not able to identify some obvious deviations in the parameters of CBC (Complete Blood Count) reports because we did not have the medical records in digital format. We kept visiting a general practitioner with paper reports, and the doctor possibly did not analyze the change in certain parameters over the months. Only when we visited a specialist did we come to understand the medical data that helped us diagnose the critical illness. To give an example, when a person has a blood cancer such as CML, CBC parameters like WBC count, hemoglobin level and platelet count change drastically. If you compare a patient's CBC reports, you can see which of the parameters have a variance of more than 10% (the actual variation in parameters in the case of CML can be more than 100%), and that information can help the doctor do further investigations to detect blood cancer.
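As a small illustration of the kind of comparison described above (a sketch with made-up sample values, not a diagnostic tool or medical advice), the Python snippet below flags CBC parameters whose change between two reports exceeds a chosen threshold:

```python
# Compare two CBC reports and flag parameters that changed by more than a threshold.
# All values are made-up sample numbers; this is an illustration, not a diagnostic tool.

def flag_deviations(previous, current, threshold_pct=10.0):
    flagged = {}
    for parameter, old_value in previous.items():
        new_value = current.get(parameter)
        if new_value is None or old_value == 0:
            continue
        change_pct = (new_value - old_value) / old_value * 100
        if abs(change_pct) > threshold_pct:
            flagged[parameter] = round(change_pct, 1)
    return flagged

march = {"wbc_count": 7500, "hemoglobin": 13.8, "platelet_count": 250000}
june  = {"wbc_count": 52000, "hemoglobin": 10.1, "platelet_count": 610000}

print(flag_deviations(march, june))
# {'wbc_count': 593.3, 'hemoglobin': -26.8, 'platelet_count': 144.0}
```

A report like this does not diagnose anything by itself, but it is exactly the kind of signal that would prompt a doctor to order further investigations.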


Digital Data, Visualization Software & Data Analysis

As a software engineer, when I reviewed the situation I realized that if we had had the medical records in digital format, some basic knowledge of medical parameters and software to visualize the records, it would have been quite easy to identify the parameters that were deviating from the normal acceptable range, and it would have been much easier for the doctor to diagnose the disease. For a patient with a chronic illness it is critical to keep a watch over changing health parameters even before you share the data with your doctor, and a simple mobile application would help the patient monitor his or her health. So I created a CBC Monitor and distributed the application free of cost to a few cancer hospitals, which have given the software to their patients for maintaining and sharing their CBC records - a test that has to be conducted every month by a CML (Chronic Myeloid Leukaemia, a type of blood cancer) patient.

Sample mobile application screen displaying CML Report in graphical format


There are a few key points to note while building software for analyzing a person's health records (a simple sketch follows the list below):
  1. There is a standard acceptable range of values for each medical parameter of the human body
  2. Every person could have unique values for the standard medical parameters that may be lower or higher than the ideal range recommended by medical standards
  3. Every illness will cause some deviation in personal medical parameters over a period of time, and this information can serve as a rule for the software
  4. The software's user interface design has to be user-friendly
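The sketch below (with hypothetical reference ranges and a made-up personal baseline) shows how the first three points might be combined: a reading is flagged if it falls outside the standard range or drifts too far from the individual's own baseline.

```python
# Rule-based health check combining standard ranges with a personal baseline.
# Reference ranges and baseline values are hypothetical illustrations, not medical guidance.

STANDARD_RANGES = {            # (low, high) acceptable values per parameter
    "wbc_count": (4000, 11000),
    "hemoglobin": (12.0, 17.5),
    "platelet_count": (150000, 450000),
}

def check_reading(parameter, value, personal_baseline, drift_pct=20.0):
    low, high = STANDARD_RANGES[parameter]
    alerts = []
    if not (low <= value <= high):
        alerts.append(f"{parameter} {value} is outside the standard range {low}-{high}")
    baseline = personal_baseline.get(parameter)
    if baseline and abs(value - baseline) / baseline * 100 > drift_pct:
        alerts.append(f"{parameter} drifted more than {drift_pct}% from personal baseline {baseline}")
    return alerts

baseline = {"wbc_count": 6800, "hemoglobin": 14.0, "platelet_count": 240000}
for alert in check_reading("wbc_count", 13500, baseline):
    print(alert)
```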

So software built with the above points in mind can be customized to monitor the health, and changes in the health, of each individual. With the increasing penetration of mobile devices, it is only logical that they are the best bet for personal health monitoring software that will empower each individual to save, analyze and share his or her medical data on the go. Unfortunately, neither the hospitals nor the government health services in #India have focused on leveraging mobile devices to empower the patient to 'digitize' his/her medical records. I would have expected private healthcare companies to leverage this opportunity to provide free software and storage to the public for maintaining their medical records, and to use the software to build customer loyalty towards their brand. What is clearly lacking is a long-term vision and the intent to provide better health management for people.
         
What is required today from IT service providers is to bridge the gap between people, hospitals and insurance companies by building software that can 'Save, Analyze, Share' healthcare data for the benefit of all parties.
  1. People require free-to-use software for maintaining their medical records.
  2. Doctors, hospitals, labs and insurance companies require software that improves customer loyalty and retention.
  3. The healthcare industry in India needs software applications that help interpret clinical and health data in a better way, by digitizing the data and using sophisticated algorithms to predict illnesses or detect them in their early stages.
  4. To initiate digital healthcare, the Indian government needs to pass laws to ensure hospitals are responsible for storing patient health records and for ensuring portability of the personal record data.
  5. The government also needs to make sure that the medical governing bodies define standards for healthcare data to enable healthcare data standardization.
  6. Finally, the government needs to define a data privacy law for the healthcare industry and create a monitoring body to ensure the laws are implemented by the industry.
America implemented HIPAA in 1996 and passed a law to ensure compliance by the healthcare industry. The law has ensured that hospitals are accountable for maintaining every American's medical records in a standard, portable digital format. India is already 23 years late in initiating standardization and implementation of digital healthcare, and we cannot delay it any further. As for the healthcare industry, the companies that provide such software services will build a new segment and set the pace for the next medical revolution in #India. We know data is valuable, and historical healthcare data is even more valuable. The reason I think this data will revolutionize healthcare in India and globally is that India is the world's 2nd most populous country, and we should consider this historical healthcare data as the 'result of voluntary drug trials', because the data gives insight into how patients respond to various drugs. So will the politicians take the advice and implement an Indian HIPAA law?





Wednesday, June 26

Microservices Architecture - Not quite Service Oriented Architecture in new bottle!

Those who are familiar with SOA have often told me, 'Microservices Architecture is actually old wine in a new bottle! The industry wanted a new hype, so it came up with the concept of Microservices Architecture.' Well, not quite. Microservices Architecture is a subset of Service Oriented Architecture; in fact, we can even call microservices a SOA design pattern.

Let's take a look at the key similarities and differences between Microservices Architecture & SOA.

The concept of a service is common to both architectures. In both, a service has a certain responsibility, and services can be developed in various technology stacks, which brings technology diversity into the architecture. In SOA, the development of services can be organized across multiple teams; however, each team needs to know about the common communication architecture. In Microservices Architecture, services can operate and be deployed independently of other services, unlike in SOA, so it is easier to deploy new versions of services frequently or scale a service independently.

                                                                                                                                                            


One hypothetical example of an SOA drawback (honestly, this rarely happens, but we are building a case for microservices) is that, since every service in SOA communicates through the ESB, if one of the services slows down it could cause the ESB to become clogged up with requests for that service. Microservices architecture, on the other hand, is not designed around an ESB and so has better fault isolation. For example, if there is a memory leak in one microservice, only that microservice is affected and the other microservices are not.



In both architectures, developers must deal with the complexity of the architecture and of a distributed system. Developers must implement the inter-service communication mechanism, whether between microservices (for example, via a message queue) or between the ESB and the services. In SOA, services share data storage, while in microservices each service can have its own independent data storage. Sharing data storage has its pros and cons: for example, data can be reused by all services, but it introduces dependencies and tight coupling between services.

The main difference between SOA and microservices lies in size and scope. A microservice has a significantly smaller scope than SOA and is mainly one of a set of small(er), independently deployable services. An SOA, on the other hand, can be either a monolith or composed of multiple microservices.

Service Oriented Architecture is a software architecture pattern that promotes reusability of services. Application components provide services to other components via a communications protocol over a network. The communication can involve a single simple service, or it can involve two or more services coordinating with each other. Microservices, in contrast, is a software architecture pattern in which complex applications are composed of small, independent processes communicating with each other using language-agnostic APIs. Microservices should be independently deployable; it should be possible to shut down a service when it is not required without any impact on the other services.
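As a minimal illustration of a "small, independently deployable service with a language-agnostic API" (a sketch using only Python's standard library; the service name, port and endpoint are hypothetical), the snippet below exposes a tiny JSON-over-HTTP endpoint that any other service, written in any language, could call:

```python
# A minimal, independently deployable "inventory" microservice exposing a JSON HTTP API.
# Service name, port and endpoint are hypothetical; this is a sketch, not a production service.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

STOCK = {"sku-101": 12, "sku-202": 0}  # in-memory data owned solely by this service

class InventoryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # e.g. GET /stock/sku-101 -> {"sku": "sku-101", "quantity": 12}
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "stock" and parts[1] in STOCK:
            body = json.dumps({"sku": parts[1], "quantity": STOCK[parts[1]]}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # Runs on its own port; it can be deployed, scaled or shut down without touching other services.
    HTTPServer(("localhost", 8080), InventoryHandler).serve_forever()
```

Because the only contract is the HTTP/JSON interface, the service can be rewritten, redeployed or scaled without the callers knowing or caring, which is the independence property discussed above.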

Wednesday, June 19

Where is your eCommerce data saved? How is your data being used? Is your data protected by Data Privacy laws??

A few questions every consumer should ask

  1. Where does the e-commerce transaction data and the consumer's personal data get stored? Does it get stored in India, in the USA, or in some country that provides cheaper storage? How about China or Pakistan?
  2. How is your data being used by service providers like Mastercard, Visa, Amazon and PayTM? Are these companies sharing insights from your data with other companies?
  3. Can the Indian government monitor data usage if the data is stored in another country? Can the government enforce its data privacy law when eCommerce companies store data in another country?
  4. What happens to your personal data in the case of a hostile situation with the country in which your data is stored?



The Big B of Data - Indian E-commerce Market 2019



Sunday, June 9

Why does India need a Data Localization Law?

The internet revolution triggered a data avalanche and led to innovations in data-crunching and data analytics technologies. For the last 10 years we have been creating a totally new type of data that I like to call event data or activity data. For every activity we are creating 'data'; for example, when you move from home to the office, your mobile device generates tons of GPS data even when you are not using it. Data scientists can analyze this GPS data and extract valuable insights that can be monetized. Today it can be said that 'data is much more valuable than money'! If you want to know why data is more valuable than money in today's world, then read on.

What is personal data? What is data privacy? What is data localization? Why should you be aware?

When you swipe your credit card or use Flipkart or Amazon, have you ever wondered how and where your financial transactions and personal information are stored? Well, most of the information (also called data) is usually partly or completely stored in a database outside India.

For those of you who are not software experts, data is nothing but information like
1) your personal details, such as date of birth, name, address, phone number and email
2) your net-banking/credit/debit card transaction details
3) your travel details, like ticket history, hotel reservations and cab payments
4) your GPS details on the date of every digital transaction
5) and many more related details, like your bank name, gas company, insurance provider, etc.
stored on some computer by the eBays, Amazons, PayTMs, Visas and Mastercards of this world. This data is stored forever because it is valuable to some company. Over time you may forget some events, but your data, which is also called your digital footprint, will always be stored on someone's computer.

Apart from the worry about who has access to your data when it is stored on an overseas computer (or cloud, which is nothing but a bunch of virtual computers), the Indian government and regulators have limited access to this data across borders. The RBI wants to change this through its data localization laws and ensure data is stored within Indian geography, so that Indian consumers and the government have sovereignty over their data. Data localization is the act of storing data on devices physically present within the borders of a country. As of now, most global as well as Indian companies store the data in a cloud outside India.
The Reserve Bank of India's localization rules mandate that companies collecting critical data about Indian consumers must store and process it within the borders of the country. The RBI issued a circular mandating that payments-related data collected by payment providers must be stored only in India, setting an October 15, 2018 deadline for compliance. This covered not only card payment services from Visa and MasterCard but also companies such as Paytm, WhatsApp and Google, which offer electronic or digital payment services.

Why worry about data now?

From the time we started using credit cards, debit cards and internet banking, information about our transactions has been stored by card companies, commerce companies and banks in databases and used for 'improving business processes and better understanding the customer'. There have been instances when some of these companies were found to be misusing this consumer information without the consent or knowledge of the customer. In all probability, if the customer were made aware, he would not approve of the way companies use his historical transaction information, which is protected by privacy laws. This triggered moves by the governments of many countries, like the USA, to draft data privacy laws to protect their citizens and their electronic data.
In one of my older posts I mentioned how your so-called harmless credit card data can be used by marketing companies to predict events in your life and push advertisements to you. One supermarket chain was once rumored to have predicted the pregnancy of its women consumers based on their digital footprint, without even using social media data (think about that when you post personal opinions on social media). The marketing company studied the shopping pattern of one family, its software predicted that the woman was pregnant, and it started pushing baby product ads to the family, to the family's shock. (You can read about big data in the previous blog post 'The one who leverages Big Data will win the elections!')

It's not just your credit-card and online purchase data that can be misused!

  • People using the internet and internet-connected devices are creating data 24/7.
  • You are creating data when you surf, shop, travel, and even when you are doing nothing in particular.
  • This data gives multidimensional insights and is invaluable to companies as well as governments.
  • Analysis of the data can tell a lot about an individual's behavior, choices and habits.
  • New-age software can learn from historical data and predict behavior and events.
  • Very few people are aware of data privacy.
  • Not all data about a person or entity is public data. By law, it is illegal to use any data that breaches a person's privacy.
The main intent behind data localization is to protect the personal and financial information of the country's citizens and residents from foreign surveillance, and to give local governments and regulators the jurisdiction to call for the data when required. This aspect has gained importance after revelations that social media giant Facebook shared user data with Cambridge Analytica, which is alleged to have influenced voting outcomes, leading to a global clamor by governments for data localization.

Why is data localization important to a country?

Data localization requires that data created within certain borders stays within them. Data localization is essential to national security because storing data locally is expected to help law-enforcement agencies access information needed for the detection of a crime or to gather evidence. Where data is not localized, the agencies need to rely on Mutual Legal Assistance Treaties (MLATs) to obtain access, delaying investigations. On-shoring global data could also create domestic jobs and skills in data storage and analytics, as the Srikrishna report pointed out. However, maintaining multiple local data centers may entail significant investments in infrastructure and higher costs for global companies, which is why they seem to be up in arms against these rules. At the same time, I must say that global companies need to respect every country's right to data sovereignty and understand the risk faced by countries when data is stored in another country. If the consumer data of 1.3 billion Indians falls into the wrong hands, it can be used to inflict huge damage on the country.

How is Data sovereignty different from Data Localization?

Data localization requires that data created within certain borders stays within them, while data sovereignty means that the data is not only stored in a designated location but is also subject to the laws of the country in which it is physically stored. Data sovereignty ensures that stored data is subject to the legal protections and punishments of that country, so data localization is essential for ensuring data sovereignty. Russia's On Personal Data Law (OPD-Law), for example, requires the storage, update and retrieval of its citizens' data to be limited to data centers within the Russian Federation.

All of us trust service providers with personal information, on both a voluntary and an involuntary basis, and we should expect greater accountability from these firms about the end use of this data. Data localization will ensure that domestic law enforcement can respond more effectively to our complaints.

Sunday, May 19

Why do the private and government sectors in #India need to re-evaluate their Data Strategy?

                       

Data Strategy – Time to re-evaluate?

It seems a long time ago that the 3 V's of volume, variety and velocity were unleashed on the world to describe the evolution of Big Data that organizations were about to see. For years we were told that we needed to get ready for a new data tsunami: we needed to be ready to store more data, to take in data that did not look like the data we traditionally had from operational systems (such as unstructured text), and to handle data arriving more quickly. This was in the web era, before mobile and social media took off. Then we had the Big Data storm, where all the V's got bigger, faster and more diverse. Once social media arrived and the use of external data to help make decisions became the norm, Big Data was everywhere, so much so that we seem to have stopped talking about it.

An emerging ecosystem of options


To deal with big data we needed new ways to store it. This led to the emergence of a new ecosystem of database options to support different needs. New model/schema databases were created, with new query approaches, to overcome gaps in what was available. Over time most companies adopted a modified data landscape including NoSQL databases rather than adopting a 'Hadoop-based data lake'. What seems to be lacking is a sound understanding of this new, more complex landscape of data sources and databases, and there is an urgent need for a fresh vision and a new roadmap for the enterprise data strategy.

When Big Data is everywhere, Big Data is just another Data

Today most organizations have stopped thinking about "Big Data" as a challenge that needs to be addressed. Now it is just the data they have to handle to meet different business requirements. Importantly, many of those organizations are moving the discussion on to how they get value from that most valuable of assets. It is no coincidence that the focus of enterprises is on getting insights from the data rather than on handling the 3 V's of Big Data. It is great that the focus is on deriving value from data, but I wonder if things are happening too fast and whether some enterprises are oversimplifying their database landscape.

Understanding the Complexity Of Data Landscape

The rapid evolution of business requirements has resulted in organizations ending up with a data landscape that has become incredibly complex. Many organizations are significantly overspending on managing that complex, bloated data landscape. Meanwhile, the European General Data Protection Regulation (GDPR) became applicable on May 25th, 2018 in all member states to harmonize data privacy laws across Europe. Organizations have a huge variety of databases, including tabular relational databases, columnar databases, NoSQL databases, and the list just goes on. Organizations have reached this point because they had to meet their business needs: the databases they had were not able to support what they needed to do when they needed to do it.

Tackling the Complexity Of Data Landscape

I believe it is time organizations STOP overhauling their data landscape piecemeal and look for an approach that drives towards a new Data Architecture Vision. It is time to take stock: think simplification of the data landscape while continuing to meet the business needs of today and of the future. Defining a fresh data vision and simplifying the data landscape will help with costs and manageability, and will help you adhere to new data protection laws. By reducing complexity at the source, organizations will be better placed to use data to create value rather than passing on chaos and complexity to value creators!

The evolution of database technologies has been almost as relentless as the progress in other areas of software. Today SQL Server can run on Linux. Would that make you consider whether an open source database is really better than an enterprise-grade, best-in-class equivalent you can now use, when security and reliability around data are going to underpin everything you do? Look at the fact that graph processing is available in SQL Server and that machine learning capabilities are now pervasive in databases, with SQL Server supporting Python and R. Would that change the need to create separate data marts for analytics processing, reducing complexity and data sprawl?

New deployment options

Finally, let's look at the new deployment options.
  • Flexible agreements that let you move to the cloud incrementally
  • Moving from on-premise to the cloud lets you reduce the overhead of hardware and of having to deal with capital expenditure
  • Using managed services in the cloud with strong SLAs to reduce administration overhead while enabling new modes of data storage to support emerging business needs
  • Building hybrid solutions that span into the cloud as needed
  • The capability to stand up what you want, when you want it, with all of that handled under clear SLAs
The modern data estate is available on-demand. It spans all deployment modes, offers almost every type of database you might need and helps you find the right ones to meet your business needs. Options abound for simplification, consolidation, modernization and agility within your data landscape all without compromising on meeting your business needs.

Moving forwards

The forward momentum in database capabilities and their deployment options is staggering, and many organizations are not on top of it. Previous decisions, even from as little as 12-18 months ago, can now be revisited to see if your data landscape is running as efficiently as possible.
It is a known fact that progressive organizations, some prompted by GDPR, are busy documenting their data assets, in most cases better than ever before. Most of them, though, are focused on what data is where, how to secure it, and how to ensure it is used appropriately.
Many are not looking at which database the data is stored in and whether migration and/or consolidation could make life much easier. Be sure to think about your data landscape and consider how it can evolve.
Here are some questions:
  1. Have you recently looked at where you are storing your data, and do you understand why you have it there? Have you evaluated whether there is a better option today?
  2. Do you know how much it is costing you to manage and maintain your data estate, and could reduced complexity lower that cost? If lowering IT costs is on your radar, this is a sure-fire way to find savings.
  3. Have you considered whether your GDPR compliance would be easier with a less complex environment to manage? Is database consolidation an option you considered on your GDPR journey? If not, why not?
  4. When did you last evaluate which databases need to be on-premise, which can be deployed in a hybrid mode, and which could be moved entirely to the cloud? If not recently, you may be constraining your potential based on old options and adding costs you do not need.

In Conclusion

A modern data estate will provide options to meet you where you need it to. As you consider your data landscape moving forward, you might want to think about whether you are missing a trick by not thinking big picture and looking for vendors who can, perhaps together with partners, cover the entire data estate and all that it entails. I have written about the need for a vision and a roadmap for an enterprise, and that applies to data strategy as well. Given the speed at which technologies are evolving and the rate at which new technologies get adopted, every CTO and CIO should review the enterprise data vision every year and make the necessary changes to the roadmap.


Understanding Generative AI and Generative AI Platform leaders

We are hearing a lot about the power of Generative AI. Generative AI is a vertical of AI that holds the power to #Create content, artwork, code...