Wednesday, May 1

Robotic Process Automation - BPM in a new bottle?

"Is robotic process automation really a new thing or just a new name for Business Process automation?"  Everybody has this question. RPA’s roots can be traced through the evolution of software robotics, so it may look similar to business process management. RPA is really a subset of  BPM and it really focuses on automating mundane human tasks and processes.

BPM (sometimes used interchangeably with business process automation) isn't a specific piece of software but an approach to streamlining business processes for maximum efficiency and value. It is an in-depth look at how processes operate, identifying areas for improvement and building solutions, usually from the ground up. BPM is about making sure the infrastructure of your business processes is solid and decoupled from the implementation technologies. RPA, on the other hand, is designed to operate processes as a human would, so it exists at a more surface level. It's faster to implement, ready to use with almost any software, and easily altered or updated to adapt to a changing world. As I see it, RPA and BPM are not in conflict with each other. They share the same goal with different implementation strategies.

While you certainly could use RPA to handle high-frequency processes previously performed by humans, perhaps what is really needed is an overhaul of your workflow. If a certain type of transaction makes up the bread and butter of your organization's service, for example, you'll want to make sure that process is as tight, efficient, and self-contained as possible. There are times when you have to transform the process itself rather than relying on a surface-level fix. That's a good case for BPM.

But transforming a business structure isn’t always feasible. It requires a lot of development and a lot of investment (time and money). You may not have the luxury to build from the ground up. That’s when RPA may be the most fitting solution. If nothing else, you can use RPA to continue operations while investigating a deeper fix.
Consider this analogy to self-driving cars: a BPM approach would require us to rip up all paved roads and install infrastructure for the new cars to move about on their own, while an RPA approach seeks to operate an existing car just as a human would. Google's self-driving car addresses the problem from an RPA angle, because replacing all roads (especially in the U.S.) is just unfathomable. That's not to say that RPA is always the better option, not at all. The key is knowing the difference and using both tactics to their best advantage. With that distinction in mind, here are some key benefits of RPA:



  • Low technical barriers: Programming skills are not necessary to configure a software robot. Because the technology is primarily code-free, non-technical staff can use a drag-and-drop process designer to set up a bot, or even record their own steps to automate a process through a process recorder feature.
  • Increased accuracy: Bots are extremely accurate and consistent – they are much less prone to making mistakes or typos than a human worker.
  • Meet regulatory compliance standards: Bots only follow the instructions they have been configured to follow and provide an audit trail history for each step. Furthermore, if steps in a particular process need to be reviewed, bots can also play back their past actions. The controlled nature of bot work makes them suited to meeting even the strictest compliance standards.
  • No interruption of work: Operations can be performed 24/7 as these bots can work tirelessly and autonomously without requiring staff to manually trigger bots to initiate business processes. If a human does need to intervene, it is to make a decision or resolve an error.
  • Existing systems remain in place: Unlike traditional automation initiatives that may require extensive developer resources to integrate across multiple applications, RPA involves no disruption to underlying systems. Robots work across the presentation layer of existing applications just as a person does. This is especially useful for legacy systems, where APIs may not be immediately available, or in situations where organizations do not have the resources to develop a deep level of integration with existing applications.
  • Improved employee morale and employee experience: Employees will have more time to invest their talents in more engaging and interesting work. Bots enable workers to offload manual tasks like filling out forms, data entry, and looking up information on websites, so workers can focus on strategy and revenue-producing activities.
  • Increased productivity: Process cycle times are more efficient and can be completed at a faster speed compared with manual process approaches.
I remember automating a Java build and configuration management process using Perl scripts 15 years ago: verifying that the code was unit tested, checked into version control, and code reviewed, and then deciding whether the code should be pulled into the next build - all done by automated scripts. We did not call it RPA, but if you know what I am talking about you will realize it was nothing but Robotic Process Automation without sophisticated tools. If you are keen to know how I automated the Java build process, send me a message and I will post a blog about it.
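
To give a flavor of the idea (this is only a minimal sketch, not my original Perl scripts; the Maven commands and the review-status file convention below are assumptions made for illustration), here is what such a pre-build "gate" might look like today in Python:

# Minimal sketch of a pre-build gate check, in the spirit of the scripts
# described above. Repository layout, review-status file and build command
# are hypothetical placeholders.
import subprocess
import sys

def unit_tests_pass() -> bool:
    """Run the test suite; a non-zero exit code means the gate fails."""
    return subprocess.run(["mvn", "test", "-q"]).returncode == 0

def committed_to_version_control() -> bool:
    """Fail if the working tree has uncommitted changes."""
    out = subprocess.run(["git", "status", "--porcelain"],
                         capture_output=True, text=True)
    return out.stdout.strip() == ""

def code_review_complete(status_file: str = "review_status.txt") -> bool:
    """Hypothetical convention: a file records 'APPROVED' when review is done."""
    try:
        return open(status_file).read().strip() == "APPROVED"
    except FileNotFoundError:
        return False

def main() -> int:
    checks = {
        "unit tests": unit_tests_pass,
        "version control": committed_to_version_control,
        "code review": code_review_complete,
    }
    for name, check in checks.items():
        if not check():
            print(f"Gate failed: {name} check did not pass; skipping this build.")
            return 1
    print("All checks passed; pulling code into the next build.")
    return subprocess.run(["mvn", "package", "-q"]).returncode

if __name__ == "__main__":
    sys.exit(main())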

Wednesday, April 3

Why is prototyping an important & critical step of software development?

I was working for a client who had bought a software package from a 'Market-Leading Software Vendor'. The package promised to deliver out-of-the-box capability for all of the client's software services requirements. The vendor gave an impressive demo of 3 of the services, and the client was happy that the product would deliver the software services in a couple of months instead of their own development estimate of 12 to 18 months. The only catch was that the vendor told the client that 'most' of the services were ready, a 'few' were under development, and they would accelerate the development and try to deliver them ASAP. The client team accepted the vendor's verbal assurance and decided to go with the packaged software (#Mistake).
The client bought the software and engaged my company to implement it, with a target to complete the implementation in 4 months as per our assumptions & estimates. As development started we began hitting issues with the packaged software code: some of the 'out of the box' services were not ready, and others had code quality issues - apparently the code had not been tested diligently. The vendor told the client that we should prioritize implementing other services that were ready. The client was not happy, and neither were we as the implementation team, but the client decided to go ahead, and we started implementing the 'new' services which the vendor told us to implement first.

After a couple of weeks we realized the new set of services was not ready either, so the client set up a meeting with the vendor. In the meeting we and the client decided to review each of the services that the vendor had promised were out of the box, and we realized that more than 80% of the services in the package sold to our client existed only on paper or were 'work in progress', and would only be available to us after 12 months. The client threatened to cancel the purchase and walk out, but realized that he had already invested quite a lot of money in the 6 months we had worked on the implementation. It would have been suicidal for the client IT team to go back to its management and admit that they had paid a high price for software from a 'Market Leader Software Vendor' without performing due diligence, and that after months of work they had realized the software was not fully ready! The client team had no option but to wait until the vendor delivered and to push out the delivery timelines. The revised budget increased to 1.5X and later to 2X, and in spite of this the final product the client got was full of bugs (which the vendor promised to fix in the future) and did not implement some of the services they had 'purchased' from the 'Market Leader Software Vendor'.

The moral of the story is that the client made some basic mistakes and did not follow the guidelines for buying an 'out-of-the-box software package'. The client was obviously at fault for finalizing the deal with the vendor before contacting my company for implementation. I was not part of the initial meetings with the client, so I can't say whether my company's representatives had advised the client to verify the readiness of the product he was going to buy, or whether we had advised a prototyping phase as we should have done.


Why is prototyping important?

1. Prototypes Help Transmit Intent
Software Packages aren’t always custom fit for your needs. By creating a prototype, the people involved at the earliest stages can better convey their vision for the software and what it’s actually intended to do — and this works better than just describing it through notes.

2. Prototypes Allow for More Customer Involvement
They also allow for that interaction to happen earlier in the design process, when it’s easier to make changes to the software. It’s not uncommon for buyers to ask for one thing, only to realize later on that what they asked for doesn’t work as well in practice as they expected it to… and it’s better for everyone if such problems are found early on. At the same time, prototypes are a good mechanism for explaining what’s technically feasible with the software — and once people know what it can do, they can turn their attention to what they want it to do.

3. Prototypes Give Users a Proper Feel for the Software's Functionality
Related to #2, prototypes serve as a good chance for users to get a feel for what kind of functionality the software will provide once it’s done. Now, at this stage, even the basic functions are far from complete — chances are the software won’t be doing anything more than the simplest tasks. However, that’s still enough to get a good sense of how it’s going to behave when it’s actually done. Without prototyping, the software could end up feeling wrong to the users — and that doesn’t help productivity.

Here is the right way to invest in an 'Out-of-the-Box Software Package', and this applies to every industry vertical.

  1. When you buy packaged software, my first piece of advice is: try not to be the 1st implementer of the product.
  2. Don't be impressed by a vendor's demo of a few services of the packaged software. Demand a complete demo of the entire set of software services you are going to buy, deployed on a software environment and configuration similar to yours.
  3. Form a team of technical and domain analysts to explore the demo environment, create a review checklist that covers all critical aspects, review each service in the software package, and submit a detailed review report.
  4. Make sure the software package implementation can be customized as per your requirements.
  5. Look at the code of the software and do a sample review of the code quality - the vendor should have no problem allowing you to review the code you are investing in!
  6. Finally, insist on a prototyping phase: implement a few critical services on a replica of your production setup and test the prototype once it is implemented. You can also do a round of performance testing on the prototype.
  7. Make sure the users in your enterprise are involved in all the above stages and provide their feedback.
  8. Once the above steps are completed with satisfactory results, you can go ahead and invest in the 'Software Vendor'.
The above checks apply to almost any software, small or large, that you invest in. I hope sharing my experience will be helpful to enterprises & people buying 'Vendor Software Packages'.

Friday, March 29

Understanding mongoDB by comparing it with Oracle RDBMS

When I started working on MongoDB the challenge was to train my team, so we created a workshop to help people transition to MongoDB seamlessly. I am going to share a few pointers and tools that you can use if you are working on MongoDB.

I assume you are familiar with some RDBMS. If you have worked with Oracle or any other RDBMS, then it is not difficult to pick up a NoSQL database like MongoDB. Oracle is an object-relational database system built around a table-column-row (TCR) structure. Data in an Oracle database is stored logically in tables. These tables are logically grouped into tablespaces. Physically, Oracle stores data in data files. Tablespaces contain segments, segments are made of one or more extents, and an extent is a collection of contiguous data blocks. Thus data blocks are the basic units of data storage in Oracle.

In comparison to Oracle, MongoDB is a NoSQL document-oriented database with what is essentially a class-and-object (CO) structure. A MongoDB server contains multiple databases. Each database comprises collections. A collection is formed of one or more documents. These documents contain fields where data is stored as key-value pairs.



As a quick reference to relate Oracle/RDBMS and MongoDB: a database maps to a database, a table to a collection, a row to a document, a column to a field, and a primary key to the _id field. In the next post I will share some sample code to show how data is accessed in MongoDB as compared to an RDBMS; it is quite different from the PL/SQL code you are used to.
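
As a small preview of that comparison (a minimal sketch only; the "hospital" database, "patients" collection and field names below are made-up examples, and a local mongod is assumed), here is the same lookup written as SQL and with pymongo:

# Minimal sketch comparing how the same lookup might be written against
# an RDBMS (SQL) and MongoDB (pymongo). Collection and field names are
# hypothetical.
from pymongo import MongoClient

# -- RDBMS style (for reference, as a SQL string) --------------------------
# SELECT first_name, last_name FROM patients WHERE city = 'Pune';

# -- MongoDB equivalent -----------------------------------------------------
client = MongoClient("mongodb://localhost:27017")   # assumes a local mongod
db = client["hospital"]                              # database   ~ Oracle database/schema
patients = db["patients"]                            # collection ~ table

# find() returns documents (~ rows); the projection picks fields (~ columns)
for doc in patients.find({"city": "Pune"},
                         {"first_name": 1, "last_name": 1, "_id": 0}):
    print(doc)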

Wednesday, March 6

Root Cause Analysis - Everybody should be able to do it!

I begin with a favorite quote from a man I hugely respect.

“Every defect is a treasure, if the company can uncover its cause

and work to prevent it across the corporation.”
– Kiichiro Toyoda, founder of Toyota


When we want to execute a project we create a plan.
While we execute the plan, things may not go according to it, so we monitor the plan's execution at regular intervals.
If something goes wrong during execution of the plan, we should do a 'Root Cause Analysis' to find:

  • What is the problem & what is the frequency of occurrence
  • Why did the problem occur - what is the trigger or cause of the problem
  • When does the problem occur, and is it a recurring problem
  • How to fix the problem, and what is the best solution
  • How to improve existing processes to avoid recurrence of the problem


There are many different techniques for conducting Root Cause Analysis. When I started working as a software engineer I was not familiar with the tools and processes for Root Cause Analysis, and I had to study resources on the internet to learn the various techniques. Later, when I started working for an MNC, I had access to the company's own customized training on Root Cause Analysis. The training was one of the optional trainings, meaning it was not mandatory for all software developers, and I tried to convince our HR to make it mandatory for all employees, because problem solving is an essential skill and everyone should know the ideal process and the tools available for Root Cause Analysis.

Anyway, what surprises me is that even today, when I meet managers from different industries, I find many of them are not familiar with Root Cause Analysis (RCA) techniques & tools. There are many different RCA techniques and you can follow any well-defined one. Managers should know that RCA is not some elite training; it is an essential skill. Unless every team member is trained to do Root Cause Analysis, the enterprise will not know the real reasons for a problem, won't be able to fix problems or will take longer to fix them, and the impact on the enterprise is huge in terms of quality, efficiency and productivity.


Here is a self-explanatory Root Cause Analysis Excel template that you can customize or use as it is.
(Link)

And here is a picture summarizing the Root Cause Analysis process steps. There are many free trainings on RCA, and I would recommend following the Six Sigma DMAIC strategy for conducting Root Cause Analysis. DMAIC, as you may be aware, is an acronym for five interconnected phases: Define, Measure, Analyze, Improve, and Control.

Six Sigma Problem Solving Steps
The above image is from Tayor Enterprise Inc
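
To make the Measure and Analyze phases a little more concrete, here is a minimal sketch (an illustrative example of my own; the issue log and its causes are entirely made up) that tallies the recorded cause of each defect so the dominant root causes stand out, Pareto style:

# Minimal sketch of the Measure/Analyze idea: count how often each recorded
# defect cause appears so the dominant causes stand out. Data is made up.
from collections import Counter

issue_log = [
    {"id": 101, "cause": "requirement misunderstood"},
    {"id": 102, "cause": "missing unit test"},
    {"id": 103, "cause": "missing unit test"},
    {"id": 104, "cause": "configuration error"},
    {"id": 105, "cause": "missing unit test"},
    {"id": 106, "cause": "requirement misunderstood"},
]

counts = Counter(entry["cause"] for entry in issue_log)

print("Defect causes by frequency:")
for cause, count in counts.most_common():
    share = 100 * count / len(issue_log)
    print(f"  {cause}: {count} ({share:.0f}%)")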


Good Read - Crisis preparedness trends 2019 - A PWC Report


Sharing a PWC Report that I find interesting. Read the last section in particular.

https://www.pwc.com/gx/en/ceo-survey/2019/Theme-assets/reports/crisis-preparedness-report-2019.pdf

Copyright https://www.pwc.com/

Tuesday, March 5

Will Digital War Be an Alternative to Nuclear War?

Countries with nuclear weapons and those without: that is how the world is classified now - not by the size of the army but by the size of the nuclear arsenal. I wonder if we will ever know who has how much nuclear capability and which country is just bluffing about it. I seriously doubt our neighboring country when they claim to have a nuclear weapon. A country that could not manufacture a single motorcycle in 70 years claiming to have built a nuclear weapon sounds hollow. Hell, I could claim I have a couple of nuclear weapons in my garage!

We are living in a connected world.
  • Everything talks to everything - man talks to machine, machines talk to machines, machines talk to data
  • Every action is an event, and all events are being recorded & stored in connected databases
  • Events can be analyzed in real time and valuable actionable insights can be extracted
  • AI is being used to teach machines WHAT we could not have programmed otherwise
  • Programming a drone or a car was an impossibility, but AI has made it possible today
  • The defense industry has been at the forefront of embracing revolutionary technologies like AI
  • A war in 2020 is going to be much different from past wars
  • Operation Neptune Spear, which eliminated Osama - identify, eliminate & retreat - was a mere glimpse of the gadgets of modern warfare that leveraged data analytics & AI
  • Real-time AI (Artificial Intelligence) driven war will be the new nuclear.
  • Digital war will be fought on land, air and water, so an integrated defense force command will be required along the lines of a Chief of Defense

In short, satellite imagery, sensor devices, the internet, massively parallel processing, video & predictive analytics, natural language processing, big data & AI are real, and everything under the sun is under the scanner. How do these technology advances impact modern warfare?
  1. Satellites can have your entire country covered and create huge amounts of information every second.
  2. New-generation instrumentation and information technology can process data at 10X speed
  3. Analytics software is processing data and predicting events in real time with high precision
  4. Artificial Intelligence is making machines capable of performing intelligent tasks like human beings
  5. NLP is enabling machines to understand and learn languages like a human, by observing human interaction
  6. AI & ML solutions are really working. Their error rate is lower than that of trained humans
Now visualize a war zone scenario and apply these technologies to build solutions for warfare
  1. Any object that moves is being observed by satellites, radars and sensors
  2. Visualize an octagonal central war room showing satellite feeds from 8 borders on 8 walls
  3. The satellite feed is analyzed in real time (on the fly) and the analysis is projected on screens
  4. 8 different teams of the Indian Army Command are monitoring 8 screens displaying activity on the borders
  5. Every movement of the enemy is captured and analyzed in real time
  6. The enemy's 1st movement is matched against potential Enemy Strategies, and helpful alerts are flashed on the Indian Army Command room screens.
  7. Even before the enemy can make its 2nd move, Predictive Analysis is predicting the enemy's potential next move
  8. A self-learning supercomputer, let's call it 'Supercomputer Maneckshaw', is matching event patterns with a 'Strategy Database', predicting the enemy's next strategic move & projecting actionable inputs on the screen
  9. By the time the enemy completes its 2nd move, the Indian Army, Navy & Air Force have taken positions, and the Army Command Center issues instructions for a counter-move to the Army/Navy/Air Force based on suggestions by Supercomputer Maneckshaw
  10. By the time the enemy makes a 3rd move, the Army Command Center has identified the complete Enemy Strategy & suggested a counter-strategy for the Indian forces to the Chief of Defense of the Armed Forces by sending a notification to their mobile devices
  11. The Chief of Defense views the action suggested by Supercomputer Maneckshaw, conducts a secured online conference with the 3 chiefs & gives orders.
  12. Human soldiers have unlimited memory but limited RAM, but the supercomputer has found references linking the identified enemy strategy to a war strategy from 1939 (the German invasion of Poland)
  13. Supercomputer Maneckshaw can map events and identify potential strategies, but it still takes a human to read the mind of the current Pakistani general. So the final decision rests with the 3 chiefs and the supreme commander
In summary, AI Warfare, or 'Real Time AI Warfare' as I call it:
  • AI warfare will be 95% supercomputer intelligence and 5% human intelligence.
  • AI warfare will be more definitive than nuclear war, because we will be able to read the mind of the enemy in real time as the enemy makes its moves and changes its mind.
  • AI warfare will be cost-effective because resources will be used optimally and deployment will be managed in real time.
  • AI warfare will always be preemptive, because it will be driven by predictive intelligence built on all the past wars fought in world history, and it will be unpredictable because the supercomputer system will evolve continuously!



Good Read - Forrester Report on The Future Of IT

Sharing a good read: The Future Of IT by Forrester
 

Thursday, February 28

2019 Magic Quadrant for Data Science and Machine Learning Platforms - Tibco, RapidMiner, KNIME, SAS as well as Dataiku & Alteryx

The Leaders quadrant is the place for companies whose vision is aligned with their customers' needs and who have the resources to execute that vision.
  • RapidMiner and KNIME continue to be in the Leaders quadrant this year
  • RapidMiner has the edge in ability to execute, while KNIME offers more vision. Both offer free and open source versions.
  • Tibco moved from the Challengers quadrant last year to the Leaders this year.
  • SAS declined from its position in the Leaders quadrant last year to the bottom of the quadrant.
The companies in the Visionaries Quadrant are those that have good future plans but which may not have the resources to execute that vision.
  • Mathworks moved forward substantially in this quadrant due to MATLAB’s ability to handle unconventional data sources such as images, video, and the Internet of Things (IoT).
  • H2O.ai is also in the Visionaries quadrant. H2O's strength is in modeling, but it is lacking in data access and preparation, as well as model management.
  • IBM dropped from the top of the Visionaries quadrant last year to the middle.
  • Databricks continues to be in the Visionaries quadrant
  • Datarobot is new to the Gartner report this year. As its name indicates, its strength is in the automation of machine learning, which broadens its potential user base.
  • Google
  • Microsoft



Friday, February 8

Where can Indian government use AI?


A quick recap for those who are new to AI - most of the computer-generated solutions now emerging in various industries do not rely on independent computer intelligence. Rather, they use human-created algorithms as the basis for analyzing data and recommending treatments. By contrast, "machine learning" relies on neural networks (a computer system modeled on the human brain). Such applications involve multilevel probabilistic analysis, allowing computers to simulate and even expand on the way the human mind processes data, which means not even the programmers can be sure how their computer programs will derive solutions.
                     There’s yet another AI variant, known as “deep learning”. In deep learning software learns to recognize patterns in distinct layers. In healthcare for example, this mechanism is becoming increasingly useful. Because each neural-network layer operates both independently and in concert – separating aspects such as color, size and shape before integrating the outcomes – these newer visual tools hold the promise of transforming diagnostic medicine and it can even search for cancer at the individual cell level.

From the above examples I hope you agree that AI-based programs can help agencies cut costs, free up millions of labor hours for more critical tasks, and deliver services better and faster. In the future AI can help do the 'thinking' for the government, but it is early days for that! Today, AI programs can recognize faces and speech, and they can learn and make informed decisions. AI-based technologies include machine learning, computer vision, speech recognition, natural language processing & robotics. AI is powerful, scalable and improving at an exponential rate. Developers are working on implementing AI solutions in everything from self-driven cars to autonomous drones, from "intelligent" robots to speech translation. The rise of more sophisticated cognitive technologies is, of course, critical to advances in several verticals:
  1. Rules-based systems capture and use expert knowledge to provide answers to tricky but routine problems. As this form of AI grows more sophisticated, users may not realize they aren't conversing with a real person. For example, an expert system might help a doctor choose the correct diagnosis based on a cluster of symptoms, or use historical game data to help a chess player select tactical moves. Likewise, within government, AI systems can answer a large number of queries and reduce the workload on humans.
  2. Speech recognition transcribes human speech automatically and accurately. The technology is improving as machines collect more examples of conversation. This has obvious value for dictation, phone assistance, and much more. For example, I worked with one police team to implement the recording of reports using speech recognition software. Not all police officers are expert typists, so an FIR (First Information Report) that takes 45 minutes to register can be documented in 15 minutes or less using speech recognition, with minimal errors.
  3. Machine translation translates text or speech from one language to another. Significant advances have been made in this field in only the past year. Machine translation has obvious implications for international relations, defense, and intelligence, as well as in our multilingual society, and has numerous domestic applications. For example, a popular book written in English can be translated into Spanish by a computer, saving months of human effort.
  4. Digital vision is the ability to identify objects, scenes, and activities in naturally occurring images. It's how Facebook sorts millions of users' photos, but it can also scan medical images for indications of disease and identify criminals from surveillance footage. Soon it will allow law enforcement to quickly scan the license plates of vehicles stopped at red lights, identifying suspects' cars in real time. I was the technical architect for a project we did for the French government, where we built a prototype of a system to identify wanted criminals on public transport.
  5. Machine learning, as we know, takes place without explicit programming. By trial and error, computers learn how to learn, mining information to discover patterns in data that can help predict future events. The larger the data sets, the easier it is to accurately gauge normal or abnormal behavior. When an email program flags a message as spam, or your credit card company warns you of a potentially fraudulent use of your card, machine learning may be involved (a minimal sketch of this kind of classifier appears at the end of this post). Deep learning, as we discussed, is a branch of machine learning involving artificial neural networks inspired by the brain's structure and function.
  6. Robotics is the creation and use of machines to perform automated physical functions. The integration of cognitive technologies such as computer vision with sensors and other sophisticated hardware has given rise to a new generation of robots that can work alongside people and perform many tasks in unpredictable environments. Examples include drones, robots used for disaster response, and robot assistants in home health care. Another example: the state of Maharashtra in India is working with the WEF to use drones to collect data to improve irrigation systems for farming.
  7. Natural language processing performs the complex task of organizing and understanding language in a human way. This goes beyond interpreting search queries or translating between Mandarin and English text. Combined with machine learning, a system can scan websites for discussions of specific topics even if the user didn't input precise search terms. Computers can identify all the people and places mentioned in a document or extract terms and conditions from contracts. As with all AI-enabled technology, these become smarter as they consume more accurate data, and as developers integrate complementary technologies such as machine translation and natural language processing, the scope is infinite.
What is required is an AI task force set up to get insight into the areas where AI will deliver maximum returns and improve the time taken to process pending workloads. From public services to the judiciary, each organization can speed up processing with the help of AI and NLP.
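
As promised in point 5 above, here is a minimal sketch of the spam-flagging kind of machine learning (an illustration only, assuming scikit-learn is installed; the tiny training set is made up and far too small for real use):

# Minimal sketch of the machine learning mentioned in point 5: flagging
# messages as spam by learning patterns from labeled examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

messages = [
    "You have won a free prize, click here to claim",
    "Lowest price guaranteed, buy now",
    "Meeting moved to 3 pm, please confirm",
    "Please find the project status report attached",
]
labels = ["spam", "spam", "ham", "ham"]

# TF-IDF turns text into numeric features; logistic regression learns
# which word patterns correlate with each label.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(messages, labels)

print(model.predict(["Claim your free prize now"]))       # likely 'spam'
print(model.predict(["Status report for the project"]))   # likely 'ham'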

Wednesday, January 30

Microservices Architecture - Enabling Agile, Scalable, Technology-Agnostic Services

Microservices is an architectural pattern where software is designed as a composition of small, independent services that communicate over well-defined APIs. The services could be owned by one team or by several small independent teams. The advantage of microservices is that since each service is independent, it can be updated, replaced and redeployed without affecting other services. It is also much easier to scale a microservice than a complex set of services. From the business user's point of view, it makes a lot of sense to use microservices in those areas where the business needs to deliver enhancements with a short turnaround time.

Since I am working on a healthcare platform, let me take the example of a traditional healthcare application developed as a monolith. This has a significant drawback in limiting the reuse of components in the development of other solutions. For example, if you have a healthcare application targeting people with type 2 diabetes and you want to adapt that solution for managing patients with Chronic Obstructive Pulmonary Disease (COPD), which is linked to diabetes, there is likely a significant portion of your original diabetes solution that you could reuse, but that would be difficult to do.

Using a chronic disease platform built as microservices, on the other hand, you could simply replace the component that interfaces with and manages data from connected devices. The services that communicate with a companion healthcare mobile application, or the service that captures insights from an Amazon Alexa virtual assistant, would be reusable services in these cases. Implementing a microservices architecture provides a number of advantages in architecting back-end solutions, while offering particular benefits in developing digital and connected healthcare solutions.


So the 2 key characteristics of the microservices pattern are:

1) Each microservice is a specialized service that addresses one specific problem
2) Each component service can be updated, replaced, redeployed and scaled without affecting other services (see the sketch below)
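
To make this concrete, here is a minimal sketch of a single-purpose service (an illustration only, assuming Flask is installed; the endpoint, port and the glucose-conversion example are all hypothetical and not from any real platform):

# Minimal sketch of a single-purpose microservice with a well-defined API.
# It does exactly one thing: convert blood-glucose readings between
# mg/dL and mmol/L.
from flask import Flask, jsonify, request

app = Flask(__name__)

MGDL_PER_MMOLL = 18.0182  # standard conversion factor for glucose

@app.route("/v1/glucose/convert", methods=["GET"])
def convert():
    """Well-defined API: ?value=126&unit=mgdl -> value in the other unit."""
    value = float(request.args.get("value", 0))
    unit = request.args.get("unit", "mgdl").lower()
    if unit == "mgdl":
        result = {"value": round(value / MGDL_PER_MMOLL, 2), "unit": "mmoll"}
    else:
        result = {"value": round(value * MGDL_PER_MMOLL, 1), "unit": "mgdl"}
    return jsonify(result)

if __name__ == "__main__":
    # Each such service can be deployed, scaled and replaced independently.
    app.run(port=5001)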
    

Benefits of Microservices

Service Agility
Microservices architecture enables small, independent development teams that take ownership of their services. Teams work independently and deliver faster. This shortens development cycle times, and the business benefits from the fast turnaround time.

Highly Scalable Services
Microservices allow each service to be independently scaled to meet demand for the application feature it supports. This enables teams to right-size infrastructure needs, accurately measure the cost of a feature, and maintain availability if a service experiences a spike in demand.

Swift Deployment  
Microservices enable continuous integration and continuous delivery, making it easy to try out new ideas and to roll back if something doesn’t work. The low cost of failure enables experimentation, makes it easier to update code, and accelerates time-to-market for new features.

Technological Freedom
Microservices architecture enables development teams to choose the best tool and technology to solve their specific problems. As a consequence, teams building microservices can choose the best language and tools for each service component.
    
Reusable Code
Dividing software into small, well-defined modules enables teams to use functions for multiple purposes. A service component written for a certain piece of functionality can be reused to deliver another feature. This promotes service reuse.

Application Resilience
Service independence increases an application’s resistance to total failure. In microservices architecture, applications handle total service failure by degrading functionality and not crashing the entire application. In a monolithic architecture, if a single component fails, it may lead to entire application failure.

    
    
    

Have you seen the very interesting 'Jobs of future index' created by cognizant ?

Have you seen the very interesting 'Jobs of the Future Index' created by Cognizant? Their website says: "To benchmark the emergence of new jobs, we have created the Cognizant Jobs of the Future Index© (CJoF Index). Our quarterly index explores the trends and patterns of 50 jobs." The index tracks 50 jobs of the future and is supposed to track their growth every quarter.

I think it is a great initiative, as it shows the technology trend and also the future prospects of 50 key technologies. I am going to keep a watch on this index and post my views on my blog. For now, go to the URL and check the index, or read my views as I update them on this post.

The 2 job statistics that interest me are CIO & Data Scientist and here is what the report says.
  1. CIO Jobs - For now what I have found is that CIO / Director of IT jobs declined by a fraction from 2016 to 2017, but increased by 50% over the last year and by 1.5% over the last quarter. Does that show a slowdown this quarter in the very important CIO jobs? I guess not.
  2. Data Scientist Jobs - There is an 80% increase in Data Scientist jobs over the last year - lovely!



According to the report over the past year (2018), the fastest growing jobs in the CJoF Index were:
  • Fashion Designer: +279%
  • Solar Engineer: +257%
  • Career Counselor: +181%
  • Social Media Strategist / Specialist: +172%
  • Genetic Counselor: +163%
The slowest growing jobs in the CJoF Index over the past year were:
  • Registered Nurse: +7%
  • Biomedical Engineer: +10%
  • Solar Installer: +12%
  • Home Health Aide: +18%
  • Aerospace Engineer: +27%
I think an index for India would differ from this one, and nurses, solar installers, home health aides, and even aerospace engineers to some extent would feature among the fastest growing jobs. These industries are picking up, and there is enough scope for expansion in sectors like hospitals, solar for homes, and home healthcare professionals. I understand there are a few initiatives where indigenous aircraft manufacturing is also happening in India; I would not call that job growth, but there is increasing demand for aerospace service engineers to cater to the growing number of air travelers.

Interesting figures, and I hope #Cognizant modifies the website to filter the demand by country. This post was made on 30 Jan 2019 and will be updated periodically.

Monday, January 21

What will be the 2019 Technology Trends? (Published Jan 2019)

As we enter 2019, here is my prediction of the technology disruptions & trends that will create an impact in 2019. I have listed 8 disruptions and will add 8 more in areas of my interest in my next post.



1) Artificial intelligence and machine learning will play a significant role in how IT teams can engage and empower employees & entire organizations. By gaining the right insights, they can show an employee that the tools and applications available deliver everything they require to carry out their job function. Robotic Process Automation will use AI to automate up to 40% of repetitive tasks done by the likes of accountants, doctors & even chief executives!

2) Affordable Health Care will reach more consumers who at present do not have access to healthcare. Big data, AI, machine learning & analytics will play a key role in delivering healthcare over mobile devices and the internet. Self-help applications will help people find and connect with health service providers remotely, and distance medicine will cover a broader spectrum of diseases. For research companies, AI will begin to see fruition, particularly in imaging diagnostics, drug discovery, and risk analytics applications.

3) Data Authenticity is essential to the decision-making process in business, & if the data is unverified it could be detrimental to the outcomes of these decisions. Machine learning is also playing a key role in helping to verify workplace data. Artificial intelligence & machine learning empower employees, & applications will continue to get more intelligent and help improve the user experience and streamline business processes.

4) Very Small Businesses will increasingly use platform services, mobile applications, blogs, websites & social media to reach out to consumers & get more business. Platform services will have to play a big role in profiling & validating the service provider, to provide a reliable quality of service & improve the customer experience.

5) AI-driven Cybersecurity, along with continuous risk profiling, will become more intelligence-driven, and machine learning will play a critical role in gathering this intelligence. Moreover, machines will start making automated decisions and implementing them to minimize an organization's cyber-risk. The enterprise goal will be to better curate, correlate, and enrich high-volume security alerts to piece together a cohesive incident detection story across a disparate landscape.

6) Game-changing blockchain innovations will rock 2019. I am all for the concept of immutable record data (which itself is not a new idea), minus the blockchain set of technologies! The end of 2019 should see a major shift in the way blockchain is implemented. Enterprises that are still waiting to implement blockchain may finally get 1 or 2 reasons to smile.

7) Internet-based video services will kill DTH in 2019; new service providers will provide stiff competition and will end the monopoly of large DTH operators by 2020. DTH policy & tariff changes, along with the penetration of broadband and wi-fi, will drive exponential growth in internet media consumers and a steep churn in existing DTH connections in 2019. We should see a steep increase in people buying internet-enabled TVs and using home wi-fi to watch video/news. DTH operators who move to a wi-fi based service will win a large market share of the WIFI_SITCOM consumer.

8) Finally, something that has worried us since 2016 and is likely to strike in 2019 is an IoT security breach. Hackers can attack unsecured IoT devices to create an extensive botnet and push enough traffic to take down a DNS provider. With the number of connected devices growing rapidly, this is a problem. A world full of connected devices and autonomous things has the potential to be a dangerous world - not because all these robots, drones and connected fridges will turn against us, but because most likely many of these products will have weak security. Security is not a core competence for IoT manufacturers, and hackers have an unfair advantage. Any day now, hackers will exploit connected devices to create a new, global attack on internet systems.

Wednesday, January 16

Another learning resource - Quick Introduction To R and an interactive Datacamp tutorial

For those who want to learn R, I am sharing the URL of a short and snappy R tutorial. It will serve as a handy guide for beginners who are used to learning from ebooks/websites. Check this URL: https://www.statmethods.net/index.html


For those who like interactive tutorials Datacamp.com has a free interactive R tutorial which is really good for beginners. Check this tutorial at https://www.datacamp.com/courses/free-introduction-to-r

Datacamp also has many advanced R and Python paid tutorials and its subscription starts from USD 29 monthly to USD 200 yearly. If you are lucky you may get special 1 year package deal to all of their tutorials for  USD 99.





 

Thursday, January 3

Does India have a comprehensive IT Strategy? Why does India need a CTO?



For years I have asked one question of the government leaders I have met at seminars & conferences: if #Yahoo & #Google can afford to invest in a CTO, why is India not hiring a CTO? India's data will deliver much more value to the #Indian Government than the value from the data collected by #Google. Political leaders can't even imagine the monetary returns and value from the insights that data analysis alone would deliver. Let me reiterate that here I am just referring to the value we would derive from data analytics on a pile of data, which is really a fraction of the overall value. Under a CTO, a unified-distributed data strategy will deliver exponential value.

There is no doubt that data empowers people, and #India is way behind in using its data because our politicians never took the help of IT experts, except the Congress, which brought in Nandan Nilekani in 2009 to lead #Aadhar. As of today #Aadhar seems to be messed up pretty badly because of the political ambitions of different parties. Every piece of IT software is built on a vision of what needs to be achieved by that software. Once the vision is documented (as in final), a plan is created and the software is built. That does not mean a vision cannot change once defined; a vision can change, but not drastically. You cannot build software to distribute social security and then one fine day make it mandatory for buying a SIM card, or use Aadhar for taking the daily attendance of employees at work, as is being done in the Mumbai Municipal Corporation today.
What we as a people need to understand is that we cannot have different political parties scrapping the previous government's implementation and implementing something new just to score political brownie points, or because the new leader fancies it! In India the current PM seems to announce first and then tell his team to implement it, and it does not work that way! We have seen a series of data leaks reported by the Tribune newspaper, and the government has not taken any action against the Tribune, so it is safe to assume the data leaks happened.
Software like Aadhar is custom-made and very complex code. If you keep changing it, eventually it will start having issues, like the data security problems we read about in the newspapers, and even more complex issues which politicians cannot even imagine. What is worrying Indian IT leaders is that the unique identification and biometrics of every Indian are at stake, and one mistake will take away the privacy of the common man, no matter whether he is rich or poor. India needs a formal & steady #ITVision vetted by all parties - no politics! Obviously this cannot be done by Rahul Gandhi or Narendra Modi, so the government needs to rehire an IT expert like Nandan Nilekani, or hire someone with similar experience, to educate the elected politicians and then coordinate with all political parties & help define India's IT Vision for the next 25 years. So if tomorrow Mamata Banerjee or some other person becomes Prime Minister, that person should not scrap or completely alter the #IT Strategy of past governments.
                                 What #Indian people should realize is 
  1. The current political leadership is making a mistake by not on-boarding all political parties to define India's IT Strategy.
  2. Just giving a slogan of Digital India does not make the Prime Minister an IT expert, so unless he hires someone like Nilekani, Narayana Murthy or someone with similar experience, we will not have a sound IT Strategy.
  3. If a political leader imposes his vision on the country without on-boarding the key political parties, the strategy is bound to fail when he leaves power.
 Nandan Nilekani is talking about a Data Strategy for India, but political leaders are yet to understand the need for a long-term IT Strategy that includes a Data Strategy. What India needs is a mature political leadership that understands information & technology and, instead of thrusting their half-baked ideas on the country, hires a CTO and lets him do the work (way too late to talk about it, but better late than never)!
 Note - Nandan Nilekani's post How To Empower 1.3 Billion Citizens With Their Data


Mr Nilekani may not agree, but the #Vision for Aadhar, and subsequently the #ITStrategy, seems to be changing far too frequently. The politicians don't understand that you do not play flip-flop with IT strategy every 5 years. If this game of political one-upmanship continues, it can permanently damage systems like Aadhar, so much so that they will have to be redesigned from scratch.
We have seen that concerned citizens of India don't like to confront the government, except a few people who filed Public Interest Litigations in the #SupremeCourt. It is fortunate that the Supreme Court was wise enough to strike down the nefarious changes in the #Aadhar guidelines that allowed private companies like banks, telecom companies and insurance companies to use Aadhar. At some point citizens have to realize that once we elect a government it is our job to monitor it and block any moves that are not good for the country. If citizens don't speak & act against a nefarious law, then we are responsible for the state of the nation. You may not understand what this data and internet revolution everyone is talking about is, but you have to know that the internet & related technologies are already affecting people's lives, changing the world, and affecting everybody, including those in remote villages. Some of these villages don't even have electricity in 2019, but they have mobile connectivity & internet, and people actually travel to a nearby town to charge their mobile phones. The US elections and even the Indian elections saw manipulation and misuse of citizens' data. Some crazy & ignorant person from the government agency TRAI shared his Aadhar number and challenged people to hack his bank account. He did not understand the concerns regarding Aadhar. No one is saying that Aadhar is so fragile that if your number is leaked your bank account can be hacked. At the same time, do understand that today there is so much of a person's data available on the internet that, by searching and correlating, a lot can be found out about the person and misused to commit fraud or at least create trouble for you. The concerns about Aadhar are much larger and much different than simply misusing someone's Aadhar number. Beware! India's IT & data strategy, or the lack of it, will decide whether #India rules the future or goes back to the dark ages.



Note - I have posted a series of posts on my blog about different concerns with #Aadhar; they can be read here: What's wrong with Aadhar? My blog posts in chronological order


MUSTREAD : How can you use Index Funds to help create wealth? HDFC MF Weekend Bytes

https://www.hdfcfund.com/knowledge-stack/mf-vault/weekend-bytes/how-can-you-use-index-funds-help-create-wealth?utm_source=Netcore...