Sunday, May 19

Why the private and government sectors in #India need to re-evaluate their Data Strategy


Data Strategy – Time to re-evaluate?

It seems a long time ago that the 3 V's of volume, variety and velocity were unleashed on the world to describe the evolution of Big Data that organizations were about to see. For years we were told to get ready for a new data tsunami: to store more data, to handle data that did not look like what we traditionally had from operational systems (such as unstructured text), and to cope with data arriving more quickly. This was in the web era, before mobile and social media took off. Then came the Big Data storm, where all the V's got bigger, faster and more diverse. Once social media arrived and using external data to make decisions became the norm, Big Data was everywhere, so much so that we seem to have stopped talking about it.

An emerging ecosystem of options


To deal with big data we needed new ways to store it. This led to the emergence of a new ecosystem of database options to support different needs. New model/schema databases were created, with new query approaches, to overcome gaps in what was available. Over time most companies adopted a modified data landscape that included NoSQL databases rather than adopting a "Hadoop-based data lake". What seems to be lacking is a sound understanding of this new, complex landscape of data sources and databases, and there is an urgent need for a fresh vision and a new road map for enterprise Data Strategy.

When Big Data is everywhere, Big Data is just another Data

Today most organizations have stopped thinking about "Big Data" as a challenge that needs to be addressed. Now it is just the data they have to handle to meet different business requirements. Importantly, many of those organizations are moving the discussion on to how they get value from that most valuable of assets. It is no coincidence that the focus of enterprises is on getting insights from the data rather than on handling the 3 V's of Big Data. It is great that the focus is on deriving value from data. But I wonder if things are happening too fast, and whether some enterprises are over-simplifying their database landscape.

Understanding the Complexity Of Data Landscape

The rapid evolution of business requirements has left organizations with a data landscape that has become incredibly complex. Many organizations are significantly overspending on managing that complex, bloated landscape. Organizations have a huge variety of databases, including tabular relational databases, columnar databases, NoSQL databases, and the list just goes on. Organizations have reached this point because they had to meet their business needs: the databases they had could not support what they needed to do, when they needed to do it. On top of this, the European General Data Protection Regulation became applicable on May 25th, 2018 in all member states to harmonize data privacy laws across Europe, raising the stakes for managing data well.

Tackling the Complexity Of Data Landscape

I believe it is time organizations stopped piecemeal overhauls of their data landscape and looked for an approach that drives towards a new Data Architecture Vision. It is time to take stock. Think simplification of the data landscape while continuing to meet the business needs of today and of the future. Defining a fresh Data Vision and simplifying the data landscape will help with costs and manageability, and help adherence to new data protection laws. By reducing complexity at source, organizations will be better set to use data to create value rather than passing on chaos and complexity to value creators! The evolution of database technologies has been almost as relentless as the progress in other areas of software. Today SQL Server can run on Linux. Would that make you reconsider whether an open source database is really better than an enterprise-grade, best-in-class equivalent you can now use, when security and reliability around data is going to underpin everything you do? Look at the fact that graph processing is available in SQL Server and that machine learning capabilities are now pervasive in databases, with SQL Server supporting Python and R. Would that remove the need to create separate data marts for analytics processing, reducing complexity and data sprawl?

New deployment options

Finally, let's look at the new deployment options.
  • Flexible agreements that let you move to the cloud incrementally
  • Moving from on-premises to the cloud to reduce the overhead of hardware and of dealing with capital expenditure
  • Using managed services in the cloud with strong SLAs to reduce administration overhead while enabling new modes of data storage to support emerging business needs
  • Building hybrid solutions that span into the cloud as needed
  • The capability to stand up what you want, when you want it, all covered by clear SLAs
The modern data estate is available on-demand. It spans all deployment modes, offers almost every type of database you might need and helps you find the right ones to meet your business needs. Options abound for simplification, consolidation, modernization and agility within your data landscape, all without compromising on meeting your business needs.

Moving forwards

The forward momentum in database capabilities and their deployment options is staggering. Many organizations are not on top of it. Previous decisions, even from as little as 12-18 months ago, can now be revisited to see if your data landscape is running as efficiently as possible.
Progressive organizations, some prompted by GDPR, are busy documenting their data assets, in most cases better than ever before. Most of them are focused on what data is where, how to secure it and how to ensure it is used appropriately.
Many are not looking at which database it is stored in, and whether migration and/or consolidation could make life much easier. Be sure to think about your data landscape and consider how it can evolve.
Here are some questions:
  1. Have you recently looked at where you are storing your data, and do you understand why you have it there? Have you evaluated whether there is a better option today?
  2. Do you know how much it is costing you to manage and maintain your data estate, and could reduced complexity reduce that? If lowering IT costs is on your radar, this is a sure-fire way to find ways to do it.
  3. Have you considered whether your GDPR compliance would be easier with a less complex environment to manage? Is database consolidation an option you considered on your GDPR journey? If not, why not?
  4. When did you last evaluate which databases need to be on-premises, which can be deployed in a hybrid mode and which could move entirely to the cloud? If not recently, you may be constraining your potential based on old options and carrying costs you do not need.

In Conclusion

A modern data estate will provide options to meet you where you need it to. As you consider your data landscape moving forwards, ask whether you are missing a trick by not thinking big picture and looking for vendors who can, perhaps together with partners, cover the entire data estate and all that entails. I have written about the need for a Vision & a Road Map for an enterprise, and that applies to Data Strategy as well. Given the speed at which technologies are evolving and the rate at which new technology gets adopted, every CTO and CIO should review the Enterprise Data Vision every year and make the necessary changes to the Road Map.


Sunday, May 12

Good Read - 3 Simple Habits to Improve Your Critical Thinking By Helen Lee Bouygues

As a Consultant Architect who works with multiple projects in my organization, I frequently meet project managers to discuss the health of their projects. Most of them assure me that their project is doing fine, the client is happy with the delivery and the team members are happy with the leadership, only to discover a couple of months later that the project has run into problems on all of these fronts. What went wrong? Surely an executive does not lie to his own organization, so how did he not anticipate the problem? Should a manager not be aware of the facts? The problem is a lack of critical thinking: leaders accept the facts that are presented to them and do not spend time investigating and verifying those facts.


Many business leaders are simply not reasoning through pressing issues, taking the time to evaluate a topic from all sides. Leaders often jump to the first conclusion, whatever the evidence. Even worse, C-suite leaders will just choose the evidence that supports their prior beliefs. A lack of meta-cognition — or thinking about thinking — is also a major driver, making people simply overconfident. Critical thinking is the process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and evaluating information to reach an answer or conclusion.

Three simple things to improve your critical thinking
  1. Question assumptions
  2. Reason through logic
  3. Diversify thought
 Do read 3 Simple Habits to Improve Your Critical Thinking by Helen Lee Bouygues

Wednesday, May 1

Robotic Process Automation - BPM in a new bottle?

"Is robotic process automation really a new thing, or just a new name for business process automation?" Everybody has this question. RPA's roots can be traced through the evolution of software robotics, so it may look similar to business process management. RPA is really a subset of BPM, one that focuses on automating mundane human tasks and processes.

BPM (sometimes used interchangeably with business process automation) isn’t a specific piece of software but an approach to streamlining business processes for maximum efficiency and value. It is an in-depth look at how processes are operating, identifying areas for improvement, and building solutions – usually from the ground up. BPM is about making sure the infrastructure of your business processes is solid and decoupled from the implementation  technologies. RPA, on the other hand, is designed to operate processes as a human would, so it exists on a more surface level. It’s faster to implement, ready to use with almost any software, and easily altered or updated to adapt to the changing world. As far as I see it, RPA and BPM are not in conflict with each other. They both have the same goal with different implementation strategies.

While you certainly could use RPA to handle high-frequency processes which had previously been performed by humans, perhaps what is really needed is an overhaul of your workflow. If a certain type of transaction makes up the bread-and-butter of your organization's service, for example, you'll want to make sure that process is as tight, efficient, and self-contained as possible. There are times when you have to transform the process itself rather than relying on a surface-level fix. That is a good case for BPM.

But transforming a business structure isn’t always feasible. It requires a lot of development and a lot of investment (time and money). You may not have the luxury to build from the ground up. That’s when RPA may be the most fitting solution. If nothing else, you can use RPA to continue operations while investigating a deeper fix.
Consider this analogy to self-driving cars: a BPM approach would require us to rip up all paved roads and install infrastructure for the new cars to move about on their own, while an RPA approach seeks to operate an existing car just as a human would. Google's self-driving car addresses the problem from an RPA angle, because replacing all roads (especially in the U.S.) is just unfathomable. That's not to say that RPA is always the better option – not at all. The key is knowing the difference and using both tactics to their best advantage.


Key benefits of RPA:
  • Low technical barriers: Programming skills are not necessary to configure a software robot. As a primarily code-free technology, any non-technical staff can use a drag and drop process designer to set up a bot—or even record their own steps to automate a process through a process recorder feature.
  • Increased accuracy: Bots are extremely accurate and consistent – they are much less prone to making mistakes or typos than a human worker.
  • Meet regulatory compliance standards: Bots only follow the instructions they have been configured to follow and provide an audit trail history for each step. Furthermore, if steps in a particular process need to be reviewed, bots can also play back their past actions. The controlled nature of bot work makes them suited to meeting even the strictest compliance standards.
  • No interruption of work: Operations can be performed 24/7 as these bots can work tirelessly and autonomously without requiring staff to manually trigger bots to initiate business processes. If a human does need to intervene, it is to make a decision or resolve an error.
  • Existing systems remain in place: Unlike traditional automation initiatives that may require extensive developer resources to integrate across multiple applications, RPA involves no disruption to underlying systems. Robots work across the presentation layer of existing applications just as a person does. This is especially useful for legacy systems, where APIs may not be immediately available, or in situations where organizations do not have the resources to develop a deep level of integration with existing applications.
  • Improved employee morale and employee experience: Employees will have more time to invest their talents in more engaging and interesting work. Bots enable workers to offload manual tasks like filling out forms, data entry and looking up information from websites, so workers can focus on strategy and revenue-producing activities.
  • Increased productivity: Process cycle times are more efficient and can be completed at a faster speed compared with manual process approaches.
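To make the audit-trail and playback points above concrete, here is a minimal Python sketch of a rule-following bot. The step names and the form-filling logic are hypothetical illustrations, not any particular RPA product's API:

```python
import datetime

class Bot:
    """A toy software robot: it only follows configured steps and
    records an audit trail that can be played back later for review."""

    def __init__(self, steps):
        self.steps = steps          # list of (name, callable) pairs
        self.audit_trail = []       # history of every step executed

    def run(self, record):
        for name, action in self.steps:
            record = action(record)
            # snapshot the state after each step, with a timestamp
            self.audit_trail.append(
                (datetime.datetime.now().isoformat(), name, dict(record)))
        return record

    def replay(self):
        # Play back past actions, as a compliance review would require
        return [(name, state) for _, name, state in self.audit_trail]

# Hypothetical steps a form-filling bot might perform
def read_form(rec):
    rec["customer"] = rec["raw"].strip().title()
    return rec

def validate(rec):
    rec["valid"] = bool(rec["customer"])
    return rec

bot = Bot([("read_form", read_form), ("validate", validate)])
result = bot.run({"raw": "  jane doe "})
print(result["customer"])   # -> Jane Doe
print(len(bot.replay()))    # -> 2
```

The controlled nature of the bot is visible here: it can do nothing outside its configured steps, and every action leaves an auditable trace.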
I remember automating a Java build and configuration management process using Perl scripts 15 years ago – from verifying that the code was unit tested, checked into version control and code reviewed, to deciding whether the code should be pulled into the next build; all of it was done by automated scripts. We did not call it RPA, but if you know what I am talking about you will realize it was nothing but Robotic Process Automation without sophisticated tools. If you are keen to know how I automated the Java build process, send me a message and I will post a blog about it.
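As a sketch of what such build-gate automation can look like – in Python rather than the original Perl, with hypothetical check names and data layout:

```python
# A minimal build gate: pull a change into the next build only if
# every pre-build check (unit tests, check-in, review) has passed.
# The gate names and the dict layout are illustrative assumptions.

def eligible_for_build(change):
    """Return (ok, failed_gates) for one candidate change."""
    gates = [
        ("unit_tested", change.get("unit_tested", False)),
        ("checked_in",  change.get("checked_in", False)),
        ("review_done", change.get("review_done", False)),
    ]
    failed = [name for name, ok in gates if not ok]
    return (len(failed) == 0, failed)

changes = [
    {"id": "C1", "unit_tested": True, "checked_in": True, "review_done": True},
    {"id": "C2", "unit_tested": True, "checked_in": True, "review_done": False},
]

# Only fully-vetted changes make it into the build list
build_list = [c["id"] for c in changes if eligible_for_build(c)[0]]
print(build_list)   # -> ['C1']
```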

Wednesday, April 3

Why is prototyping an important & critical step of software development?

I was working for a client who had bought a software package from a 'Market Leading Software Vendor'. The package promised to deliver, out of the box, all the capabilities for his software services requirements. The vendor did an impressive demo of 3 of the services, and the client was happy that the product would deliver the software services in a couple of months instead of their own development estimate of 12 to 18 months. The only catch was that the 'Software Vendor' told the client that 'most' of the services were ready and a 'few' were under development, but that they would accelerate the development and try to deliver them ASAP. The client team accepted the vendor's verbal assurance and decided to go with the packaged software (#Mistake).
The client bought the software and engaged my company to implement it, with a target of completing the implementation in 4 months as per our assumptions & estimates. As development started we began hitting issues with the packaged software code: some of the 'out of the box' services were not ready, and others had code quality issues – apparently the code had not been tested diligently. The 'Software Vendor' told the client that we should prioritize implementing the other services that were ready. The client was not happy, and neither were we as the implementation team, but the client decided to go ahead and we started implementing the 'new' services which the 'Software Vendor' told us to implement first.

After a couple of weeks we realized the new set of services was not ready either, so the client set up a meeting with the 'Software Vendor'. In the meeting we and the client decided to review each of the services that the vendor had promised were out of the box, and we realized that more than 80% of the services in the package sold to our client existed only on paper as 'work in progress' and would only be available after 12 months. The client threatened to cancel the purchase and walk out, but realized that he had invested quite a lot of money in the 6 months we had worked on the implementation. It would be suicidal for the client's IT team to go back to its management and admit that they had paid a high price for software from a 'Market Leader Software Vendor' without performing due diligence, only to realize after months of work that the software was not fully ready! The client team had no option but to wait until the 'Software Vendor' delivered, and to push out the delivery timelines. The revised budget increased to 1.5X and later to 2X, and in spite of this the final product the client got was full of bugs (which the 'Software Vendor' promised to fix in the future) and did not implement some of the services they had 'purchased' from the 'Market Leader Software Vendor'.

The moral of the story is that the client made some basic mistakes and did not follow the guidelines for buying an 'out of the box' software package. The client was obviously at fault for finalizing the deal with the vendor before contacting my company for the implementation. I was not part of the initial meetings with the client, so I can't say whether my company's representatives had advised the client to verify the readiness of the product he was going to buy, and whether we had advised a prototyping phase as we should have done.


Why is prototyping important ?

1. Prototypes Help Transmit Intent
Software Packages aren’t always custom fit for your needs. By creating a prototype, the people involved at the earliest stages can better convey their vision for the software and what it’s actually intended to do — and this works better than just describing it through notes.

2. Prototypes Allow for More Customer Involvement
They also allow for that interaction to happen earlier in the design process, when it’s easier to make changes to the software. It’s not uncommon for buyers to ask for one thing, only to realize later on that what they asked for doesn’t work as well in practice as they expected it to… and it’s better for everyone if such problems are found early on. At the same time, prototypes are a good mechanism for explaining what’s technically feasible with the software — and once people know what it can do, they can turn their attention to what they want it to do.

3. Prototypes Provide Users Proper Clarity of Feel for the Software’s Functionality
Related to #2, prototypes serve as a good chance for users to get a feel for what kind of functionality the software will provide once it’s done. Now, at this stage, even the basic functions are far from complete — chances are the software won’t be doing anything more than the simplest tasks. However, that’s still enough to get a good sense of how it’s going to behave when it’s actually done. Without prototyping, the software could end up feeling wrong to the users — and that doesn’t help productivity.

Here is the right way to invest in an 'Out of the Box Software Package Product', and this applies to every industry vertical.

  1. My first advice: when you buy packaged software, try not to be the first implementer of the product
  2. Don't be impressed by a vendor's demo of a few services of the packaged software. Demand a complete demo of all the software services you are going to buy, deployed on a software environment and configuration similar to yours
  3. Form a team of technical and domain analysts to explore the demo environment, create a review checklist that covers all critical aspects, review each service in the software package and submit a detailed review report
  4. Make sure the software package implementation can be customized as per your requirements
  5. Look at the code of the software and do a sample review of the code quality – the 'Software Vendor' should have no problem allowing you to review the code you are investing in!
  6. Finally, insist on a prototyping phase: implement a few critical services on a replica of your production setup and test the prototype once it is implemented. You can also do a round of performance testing on the prototype
  7. Make sure the users in your enterprise are involved in all the above stages and provide their feedback.
  8. Once the above steps are completed with satisfactory results, you can go ahead and invest in the 'Software Vendor'
The above checks apply to almost any software, small or large, that you invest in. I hope sharing my experience will be helpful to enterprises & people buying 'Vendor Software Packages'.

Friday, March 29

Understanding mongoDB by comparing it with Oracle RDBMS

When I started working on mongoDB the challenge was to train my team, and we had to create a workshop so that people could seamlessly transition to MongoDB. I am going to share a few pointers and tools that you can use if you are working on mongoDB.

I assume you are familiar with some RDBMS. If you have worked with Oracle or any other RDBMS, it is not that difficult to pick up a NoSQL database like MongoDB. Oracle is an object-relational database system built on a table-column-row (TCR) structure. Data in an Oracle database is stored logically in tables, and these tables are grouped into tablespaces. Physically, Oracle stores data in data files. Tablespaces contain segments, which are made of one or more extents; an extent is a collection of contiguous data blocks. Thus data blocks are the basic units of data storage in Oracle.

In comparison to Oracle, mongoDB is a NoSQL document-oriented database with what is essentially a class-and-object (CO) structure. A mongoDB server contains multiple databases. Each database consists of collections, and a collection is made up of one or more documents. These documents contain fields where data is stored in the form of key-value pairs.
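To make the document model concrete without needing a running server, here is a plain-Python sketch: a collection modelled as a list of dict documents, with a tiny stand-in for MongoDB's `find()`. The collection name, field names and data are hypothetical; the commented Oracle/mongoDB equivalents show the mapping:

```python
# Oracle:  SELECT name FROM employees WHERE dept = 'IT'
# mongoDB: db.employees.find({"dept": "IT"}, {"name": 1})
employees = [                                   # collection
    {"_id": 1, "name": "Asha", "dept": "IT",    # document with
     "skills": ["java", "plsql"]},              # key-value fields
    {"_id": 2, "name": "Ravi", "dept": "HR"},
    {"_id": 3, "name": "Meena", "dept": "IT"},  # note: no fixed schema –
]                                               # documents can differ

def find(collection, query):
    """A tiny stand-in for MongoDB's find(): match documents whose
    fields equal every key-value pair in the query document."""
    return [doc for doc in collection
            if all(doc.get(k) == v for k, v in query.items())]

it_names = [doc["name"] for doc in find(employees, {"dept": "IT"})]
print(it_names)   # -> ['Asha', 'Meena']
```

Notice how the query itself is a document – in MongoDB you filter with key-value pairs rather than a WHERE clause, and nested fields like `skills` live inside the document instead of in a joined table.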



I have shared a quick reference table to relate Oracle/RDBMS and mongoDB. In the next post I will share some sample code to show how data is accessed in mongoDB as compared to an RDBMS – it is quite different from the PL/SQL code you have been used to.

Wednesday, March 6

Root Cause Analysis - Everybody should be able to do it!

I begin with a favorite quote from a man I hugely respect.

“Every defect is a treasure, if the company can uncover its cause

and work to prevent it across the corporation.”
– Kiichiro Toyoda, founder of Toyota


When we want to execute a project we create a plan.
While you execute the plan, things may not go according to it, so we monitor the plan's execution at regular intervals.
When something does go wrong during execution, we should do a 'Root Cause Analysis' to find:

  • What is the problem & what is the frequency of occurrence
  • Why did the problem occur – what is the trigger or cause of the problem
  • When does the problem occur – is it a recurring problem
  • How to fix the problem – what is the best solution
  • How to improve existing processes to avoid recurrence of the problem.


There are many different techniques for conducting Root Cause Analysis. When I started working as a software engineer I was not familiar with the tools and processes for Root Cause Analysis, and I had to study resources on the internet to learn the various techniques. Later on, when I started working for an MNC, I had access to the company's own customized training on Root Cause Analysis. It was an optional training, meaning it was not mandatory for all software developers, and I tried to convince our HR to make it mandatory for all employees, because problem solving is an essential skill and everyone should know the ideal process and the tools available for Root Cause Analysis.

Anyway, what surprises me is that even today, when I meet managers from different industries, I find many of them are not familiar with Root Cause Analysis (RCA) techniques & tools. There are many different RCA techniques, and you can follow any well-defined one. Managers should know that RCA is not some elite training; it is an essential skill. Unless every team member is trained to do Root Cause Analysis, the enterprise will not know the real reasons for a problem, won't be able to fix problems (or will take longer to fix them), and the impact on the enterprise is huge in terms of quality, efficiency and productivity.


Here is a self-explanatory Root Cause Analysis Excel template that you can customize or use as is.
(Link)

And here is a picture to summarize the Root Cause Analysis process steps. There are many free trainings on RCA, and I would recommend following the Six Sigma DMAIC strategy for conducting Root Cause Analysis. DMAIC, as you may be aware, is an acronym for five interconnected phases: Define, Measure, Analyze, Improve, and Control.

Six Sigma Problem Solving Steps
The above image is from Taylor Enterprises Inc


Good Read - Crisis preparedness trends 2019 - A PWC Report


Sharing a PWC Report that I find interesting. Read the last section in particular.

https://www.pwc.com/gx/en/ceo-survey/2019/Theme-assets/reports/crisis-preparedness-report-2019.pdf

Copyright https://www.pwc.com/

Tuesday, March 5

Will Digital War Be Alternative To Nuclear War?

The world is now classified into countries with nuclear weapons and those without – not by the size of the army but by the size of the nuclear arsenal. I wonder if we will ever know who has how much nuclear capability and which country is just bluffing about its nuclear capability. I seriously doubt our neighbor country when they claim to have a nuclear weapon. A country that could not manufacture a single motorcycle in 70 years claiming to have built a nuclear weapon sounds hollow. Hell, I could claim I have a couple of nuclear weapons in my garage!

We are living in a connected world.
  • Everything talks to everything – man talks to machine, machines talk to machines, machines talk to data
  • Every action is an event, and all events are being recorded & stored in connected databases
  • Events can be analyzed in real time and valuable, actionable insights can be extracted
  • AI is being used to teach machines what we could not have programmed otherwise
  • Programming a drone or a car was once an impossibility, but AI has made it possible today
  • The defense industry has been at the forefront of embracing revolutionary technologies like AI
  • A war in 2020 is going to be much different from past wars
  • Operation Neptune Spear that eliminated Osama – identify, eliminate & retreat – was a mere glimpse of the gadgets of modern warfare that leveraged data analytics & AI
  • Real-time AI (Artificial Intelligence) driven war will be the new nuclear.
  • Digital war will be fought on land, air and water, so an integrated defense force command will be required on the lines of a Chief of Defense

In short, satellite imagery, sensor devices, the Internet, massively parallel processing, video & predictive analytics, natural language processing, Big Data & AI are real, and everything under the sun is under the scanner. How do these technology advances impact modern warfare?
  1. Satellites can have your entire country covered and create huge amount of information every second. 
  2. The new generation instrumentation and information technology are able to process data at 10X speed
  3. Analytics software are processing data and predicting events in real time with high precision
  4. Artificial Intelligence is making machines capable of performing intelligent tasks like human beings
  5. NLP is enabling machines to understand and learn languages like a human, by observing human interaction
  6. AI & ML solutions are really working. Their error rate is lower than that of trained  humans
Now visualize a war zone scenario and apply these technologies to build solution for warfare
  1. Any object that moves is being observed by the satellites, radars and sensors
  2. Visualize an octagonal central war room is showing satellite feeds from 8 borders on 8 walls
  3. Satellite feed is being analyzed in real time (on the fly) and analysis is being projected on screens
  4. 8 different teams of Indian Army Command are monitoring 8 screens displaying activity on borders
  5. Every movement of enemy is captured and analyzed in real time
  6. The enemy's 1st movement is matched with potential Enemy Strategies, and helpful alerts are flashed on the Indian Army Command room screens.
  7. Even before the enemy can make its 2nd move, Predictive Analysis is predicting the enemy's potential next move
  8. A self-learning supercomputer, let's call it 'Supercomputer Maneckshaw', is matching event patterns with a 'Strategy Database', predicting the enemy's next strategic move & projecting actionable inputs on the screen
  9. By the time the enemy completes its 2nd move, the Indian Army, Navy & Air Force have taken positions, and the Army Command Center issues instructions for a counter move to the Army/Navy/Air Force based on suggestions by Supercomputer Maneckshaw
  10. By the time the enemy makes a 3rd move, the Army Command Center has identified the complete Enemy Strategy & suggested a counter strategy for the Indian forces to the Chief of Defense of the Armed Forces, by sending a notification to their mobile devices
  11. The Chief of Defense views the action suggested by Supercomputer Maneckshaw, conducts a secure online conference with the 3 chiefs & gives orders.
  12. Human soldiers have unlimited memory but limited RAM; the supercomputer has found references linking the identified enemy strategy to a war strategy from 1939 (the German invasion of Poland)
  13. Supercomputer Maneckshaw can map events, identify potential strategies but it still takes a human to read the mind of current Pakistani general. So final decision rests with the 3 chiefs and the supreme commander
In summary AI Warfare or 'Real Time AI Warfare' as I call it
  • AI warfare will be 95% supercomputer intelligence and 5% human intelligence.
  • AI warfare will be more decisive than nuclear war, because we will be able to read the mind of the enemy in real time as the enemy makes its moves and changes its mind.
  • AI warfare will be cost-effective, because resources will be used optimally and deployment will be managed in real time.
  • AI warfare will always be preemptive, because it will be driven by predictive intelligence built on all the past wars fought in world history, and it will be unpredictable because the supercomputer system will evolve continuously!



Good Read - Forrester Report on The Future Of IT

Sharing a good read, The Future Of IT by Forrester.
 

Thursday, February 28

2019 Magic Quadrant for Data Science and Machine Learning Platforms - TIBCO, RapidMiner, KNIME and SAS, as well as Dataiku & Alteryx

The Leaders quadrant is the place for companies whose vision is aligned with their customers' needs and who have the resources to execute that vision.
  • RapidMiner and KNIME continue to be in the Leaders quadrant this year
  • RapidMiner has the edge in ability to execute, while KNIME offers more vision. Both offer free and open source versions.
  • Tibco moved from the Challengers quadrant last year to the Leaders this year.
  • SAS declined from its position in the Leaders quadrant last year to the bottom of the quadrant.
The companies in the Visionaries Quadrant are those that have good future plans but which may not have the resources to execute that vision.
  • Mathworks moved forward substantially in this quadrant due to MATLAB’s ability to handle unconventional data sources such as images, video, and the Internet of Things (IoT).
  • H2O.ai is also in the Visionaries quadrant. H2O’s strength is in modeling, but it is lacking in data access and preparation, as well as model management.
  • IBM dropped from the top of the Visionaries quadrant last year to the middle.
  • Databricks continues to be in the Visionaries quadrant
  • Datarobot is new to the Gartner report this year. As its name indicates, its strength is in the automation of machine learning, which broadens its potential user base.
  • Google
  • Microsoft



Friday, February 8

Where can Indian government use AI?


A quick recap for those who are new to AI - Most of the computer-generated solutions now emerging in various industries do not rely on independent computer intelligence. Rather, they use human-created algorithms as the basis for analyzing data and recommending treatments. By contrast, “machine learning” relies on neural networks (computer systems modeled on the human brain). Such applications involve multilevel probabilistic analysis, allowing computers to simulate and even expand on the way the human mind processes data, which means not even the programmers can be sure how their computer programs will derive solutions.
                     There’s yet another AI variant, known as “deep learning”. In deep learning, software learns to recognize patterns in distinct layers. In healthcare, for example, this mechanism is becoming increasingly useful: because each neural-network layer operates both independently and in concert – separating aspects such as color, size and shape before integrating the outcomes – these newer visual tools hold the promise of transforming diagnostic medicine, and can even search for cancer at the individual-cell level.
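To make the "layers" idea concrete, here is a minimal sketch in plain Python. Everything in it is invented for illustration (the weights, biases and the three input "feature" scores standing in for color, size and shape); it only shows how one layer's outputs feed the next layer, which integrates them.

```python
def relu(x):
    """Simple activation: pass positives through, clip negatives to zero."""
    return max(0.0, x)

def layer(inputs, weights, biases):
    """One fully connected layer: each output mixes all of its inputs."""
    return [relu(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# Hypothetical input: three scores standing in for color, size, shape
features = [0.8, 0.1, 0.5]

# Layer 1 separates aspects of the input; layer 2 integrates the outcomes
hidden = layer(features, [[0.5, -0.2, 0.1], [0.3, 0.8, -0.5]], [0.0, 0.1])
score = layer(hidden, [[1.0, -1.0]], [0.0])

print(score)  # a single combined score, here about 0.26
```

A real deep network has many more layers and learns its weights from data rather than having them written by hand, but the layered separate-then-integrate flow is the same.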

From the above examples I hope you agree that AI-based programs can help agencies cut costs, free up millions of labor hours for more critical tasks and deliver services better and faster. In the future AI can help do the 'thinking' for the government, but it is early days for that! Today AI programs can recognize faces and speech; they can learn and make informed decisions. AI-based technologies include machine learning, computer vision, speech recognition, natural language processing and robotics. AI is powerful, scalable and improving at an exponential rate. Developers are working on implementing AI solutions in everything from self-driving cars to autonomous drones, from “intelligent” robots to speech translation. The rise of more sophisticated cognitive technologies is, of course, critical to advances in several verticals:
  1. Rules-based systems capture and use expert knowledge to provide answers to tricky but routine problems. As this form of AI grows more sophisticated, users may not realize they aren’t conversing with a real person. For example, an expert system might help a doctor choose the correct diagnosis based on a cluster of symptoms, or use historical game data to help a chess player select tactical moves. Likewise, within government, AI systems can answer a large volume of queries and reduce the workload on humans.
  2. Speech recognition transcribes human speech automatically and accurately. The technology is improving as machines collect more examples of conversation. This has obvious value for dictation, phone assistance, and much more. For example, I worked with one police team to implement the recording of reports using speech recognition software. Not all police officers are expert typists, so an FIR (First Information Report) that takes 45 minutes to register can be documented in 15 minutes or less using speech recognition, with minimal errors.
  3. Machine translation translates text or speech from one language to another. Significant advances have been made in this field in only the past year. Machine translation has obvious implications for international relations, defense, and intelligence, as well as numerous domestic applications in our multilingual society. For example, a popular book written in English can be translated into Spanish by a computer, saving months of human effort.
  4. Digital vision is the ability to identify objects, scenes, and activities in naturally occurring images. It’s how Facebook sorts millions of users’ photos, but it can also scan medical images for indications of disease and identify criminals from surveillance footage. Soon it will allow law enforcement to quickly scan the license plates of vehicles stopped at red lights, identifying suspects’ cars in real time. I was the technical architect for a project for the French government in which we prototyped a system to identify wanted criminals on public transport.
  5. Machine learning, as we know, takes place without explicit programming. By trial and error, computers learn how to learn, mining information to discover patterns in data that can help predict future events. The larger the data sets, the easier it is to accurately gauge normal or abnormal behavior. When an email program flags a message as spam, or your credit card company warns you of a potentially fraudulent use of your card, machine learning may be involved. Deep learning, as we discussed, is a branch of machine learning involving artificial neural networks inspired by the brain’s structure and function.
  6. Robotics is the creation and use of machines to perform automated physical functions. The integration of cognitive technologies such as computer vision with sensors and other sophisticated hardware has given rise to a new generation of robots that can work alongside people and perform many tasks in unpredictable environments. Examples include drones, robots used for disaster response, and robot assistants in home health care. Another example: the state of Maharashtra in India is working with the WEF to use drones to collect data to improve irrigation systems for farming.
  7. Natural language processing performs the complex task of organizing and understanding language in a human way. This goes beyond interpreting search queries or translating between Mandarin and English text. Combined with machine learning, a system can scan websites for discussions of specific topics even if the user didn’t input precise search terms. Computers can identify all the people and places mentioned in a document, or extract terms and conditions from contracts. As with all AI-enabled technology, these become smarter as they consume more accurate data, and as developers integrate complementary technologies such as machine translation and natural language processing, the scope is infinite.
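The spam example in point 5 can be sketched in a few lines of plain Python. The four "training" messages and the crude add-one smoothing below are invented for illustration; a real filter learns from millions of labelled messages, but the principle, learning word patterns from labelled data instead of writing explicit rules, is the same.

```python
import math
from collections import Counter

# Hypothetical labelled training data: (message, is_spam)
training = [
    ("win cash prize now", True),
    ("free prize claim now", True),
    ("meeting agenda attached", False),
    ("project status report", False),
]

# Count how often each word appears in spam and in normal mail
spam_words = Counter(w for msg, is_spam in training if is_spam for w in msg.split())
ham_words = Counter(w for msg, is_spam in training if not is_spam for w in msg.split())

def spam_score(message):
    """Log-odds of spam vs. normal mail, with crude add-one smoothing."""
    spam_total = sum(spam_words.values())
    ham_total = sum(ham_words.values())
    score = 0.0
    for w in message.split():
        p_spam = (spam_words[w] + 1) / (spam_total + 2)
        p_ham = (ham_words[w] + 1) / (ham_total + 2)
        score += math.log(p_spam / p_ham)
    return score

print(spam_score("claim your free prize"))  # positive: leans spam
print(spam_score("status of the project"))  # negative: leans normal
```

Notice that nobody wrote a rule saying "prize" is suspicious; the scorer inferred it from the labelled examples, which is exactly why more (and more accurate) data makes such systems smarter.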
What is required is an AI task force, set up to gain insight into the areas where AI will deliver maximum returns and reduce the time needed to process pending workloads. From public services to the judiciary, each organization can speed up processing with the help of AI and NLP.

MUST READ: How can you use Index Funds to help create wealth? HDFC MF Weekend Bytes

https://www.hdfcfund.com/knowledge-stack/mf-vault/weekend-bytes/how-can-you-use-index-funds-help-create-wealth?utm_source=Netcore...