Tuesday, May 23

What is Artificial Intelligence infused BPM?

When I created Accenture's Point of View (POV) on BPM (a comparative study of BPM products, or iBPM products as Gartner likes to call them), we predicted a 25% year-on-year rise in BPM adoption across the top 5 industry verticals, and an average year-on-year productivity improvement of over 10% for enterprises with mature BPM implementations. Back then Digital Transformation was not yet a buzzword; today not just businesses but even governments are aggressively pushing for it. It feels great to see that people are well past debating the need for BPM in Digital Transformation, and the discussion has moved on to the sensational 'Artificial Intelligence infused BPM'.
Artificial intelligence focuses on making systems capable of simulating human-like decision-making and execution – enabling them to perform functions traditionally executed by skilled human professionals, but at a much higher level because of the speed and power available on modern computing platforms. One needs to understand that for AI to really happen, the AI software architecture would have to resemble our own central nervous system, which controls most of what we do even though we don't consciously think about it. Whenever AI matures to that point, it will use algorithms instead of nerve signals to simulate human behavior.
Frankly, what we are implementing today does not have 'human like decision making' capability, and that is why we cannot call it AI. AI is the future, and huge investments are being made in research, but existing systems do not have intelligence similar to humans because we cannot yet produce software that reproduces the emotional and biochemical aspects of a human brain. What people at large refer to as AI (as of Jan 2018) is actually Machine Learning driven by big data and data mining, which gives insight to improve decision making; there is no human-like intelligence as claimed by some companies. The fact remains that insight from big data aids better and smarter decision making, since we now have huge volumes of data and the technology to process them at a fast pace. We have been using insights from historical data to make better business decisions for quite a few years now, and if the industry decides to call such data insight AI, then we can say AI and BPM are old friends.
So if someone tells you he is working on something revolutionary, integrating AI with BPM, you can tell him that AI-BPM has been in production for quite some time - actually for quite a few years (smile)! We did implement Smart Business Processes that could be triggered by events from a Complex Event Processing framework based on certain event types. We did implement real-time Big Data processing and integrated it with BPM to get insight from data in motion and make smarter decisions in real time. In short, we have been doing AI-driven BPM for years, so don't be floored by the tall claims of some AI-BPM expert!

The point I want to make is that although AI-BPM is not new, AI has been evolving at a fast pace along with ML, and we need to continuously innovate and integrate ML with BPM to get better business insights. What we have already implemented for various industries is a Smart Next Best Action (NBA) capability that helps a software system take better decisions in real time. Typically NBA is custom software that uses intelligent insights extracted from big-data processing to aid enterprise decision making. We use the word intelligent not because the system is smart like a human, but because it bases its decisions on millions of past records or transactions to recommend the most appropriate action - something that can almost act like a human, not because of intelligence but because of Machine Learning.
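
To make the NBA idea concrete, here is a minimal, hypothetical sketch (not any vendor's actual implementation): a classifier trained on historical customer records that scores candidate actions for a live customer and recommends the best one. The features and action names are made up for illustration.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Hypothetical history: [age, account_value, days_since_last_contact]
    # paired with the action that worked best for that customer in the past.
    X_history = np.array([
        [25, 1000, 10],
        [52, 9000, 120],
        [34, 3000, 45],
        [61, 12000, 200],
    ])
    best_action = ["offer_upgrade", "retention_call", "offer_upgrade", "retention_call"]

    # Train on past outcomes; in production this would be millions of records.
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_history, best_action)

    # Score a live customer context and recommend the next best action.
    live_customer = np.array([[58, 11000, 150]])
    for action, p in zip(model.classes_, model.predict_proba(live_customer)[0]):
        print(f"{action}: {p:.2f}")
    print("Next best action:", model.predict(live_customer)[0])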

Here are some random industry numbers about AI & BPM  -
  • As of today, more than 50% of businesses processing Big Data have implemented AI solutions, and these businesses report more than a 50% increase in new business insights from Big Data.
  • AI has helped 50% of implementations make better business decisions, 20% of businesses claim improved automated communication with the help of AI, and only 6% of businesses claim to have reduced workforce by implementing AI.
  • The most implemented area for AI is Predictive Analytics (e.g. weather data, operational maintenance, etc.).
  • More than 80% of implementors claim that AI has improved efficiency and created new jobs.
  • Almost all implementors acknowledge that data analytics technologies are more efficient when coupled with AI.

So the intelligence from insights into huge data is helping make business smarter, more proactive, more predictive, more efficient, more productive and more customer friendly, thus opening avenues for new products and business expansion.

So how are AI and Data Intelligence (DI) changing BPM?

  1. Intelligent Recommendations - Continuous machine learning can provide relevant recommendations to customers as well as the business.
  2. Intelligent Marketing - AI can make recommendations to agents or directly to consumers using profile attributes and response behavior, and it keeps learning in real time so that the next best offers stay relevant to the customer and improve over time. Software can help the marketing agent deliver the right recommendation to the right customer at the right time.
  3. Process Automation - Data insight helps reduce workflow inefficiencies, automate human tasks and processes, and cut down repetitive work.
  4. Preferential Treatment for Valued Customers - ML and predictive analytics can estimate a customer's behavior and guide the agent to satisfy the customer.
  5. Next Best Action - NBA guides agents on the next best action to take to solve a specific problem, leading to higher customer satisfaction; it can also predict sales lead conversion and reduce customer churn.
  6. Sales Prediction - Predictive analytics predicts the likelihood of a lead closing and suggests next best actions and strategies to the sales agent. A predictive engine can identify new sales opportunities that may not be outright visible to the team.
  7. Customer Retention - A predictive engine can predict customer churn and suggest the steps required to retain the customer (see the churn sketch after this list).
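
To illustrate items 5-7, here is a minimal, hypothetical churn-prediction sketch using logistic regression; the features, data and threshold are assumptions, not a production model.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical records: [monthly_spend, support_tickets, months_active],
    # labelled 1 = churned, 0 = retained, drawn from historical data.
    X = np.array([
        [20, 5, 3],
        [80, 0, 36],
        [15, 7, 2],
        [95, 1, 48],
        [30, 4, 6],
        [70, 0, 24],
    ])
    y = np.array([1, 0, 1, 0, 1, 0])

    model = LogisticRegression().fit(X, y)

    # Score an active customer; a high churn probability triggers retention steps.
    customer = np.array([[25, 6, 4]])
    churn_probability = model.predict_proba(customer)[0][1]
    if churn_probability > 0.5:  # assumed business threshold
        print(f"Churn risk {churn_probability:.0%}: trigger a retention offer")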

Game-changing BPM & Data Intelligence / Artificial Intelligence

What BPM can deliver today is not just efficient and smart process management but real-time business management. The BPM game has changed, and BPM now offers:

1) Predictive Business - Analyze, Sense, Learn, Predict, Act
2) Proactive recommendations leading to better customer service
3) Reduced churn by predicting and addressing customer concerns
4) Better value delivered to the customer, based on customer insight
5) Better forecasting through a 360-degree view of customer and business
6) A real-time enterprise proactively addressing real-time events

There are many BPM vendors, and vendor analyses by Accenture, Gartner or Forrester can help you decide which BPM vendor has the product and features that are right for your solution. Pega and Appian are some of the leading BPM players of 2018, but there are at least 19 BPM vendors to choose from; you can refer to 'How to select a BPM (Business Process Management) product?' to learn how to go about selecting the right BPM product.

Thursday, March 23

The tomorrow of every connected enterprise is the Hyper-connected Enterprise

 

Hyper-connected Home

Where do you buy sugar? Do you pay cash at the local grocery store, or order it on a mobile app? If you use #BigBasket or #Amazon to buy 1kg of sugar, you should be aware that hundreds of computers and devices and at least half a dozen companies, from manufacturer to stockist to cargo delivery to Amazon, are collaborating over a 'hyper-connected network' to get your sugar delivered on time. That means you are already part of Amazon's hyper-connected enterprise, where computers, devices and employees create a seamless delivery experience for the customer. Your smart city is becoming a hyper-connected city - water, electricity, garbage, emergency services, healthcare, social welfare: everything has either moved or is moving to a hyper-connected network of enterprises. You may want to check how companies like Libelium World are helping monitor the environment in smart cities in Spain.

Take another example from the entertainment industry. How do you watch your favorite sport? In your living room on a large-screen TV, via live stream on your mobile/tablet/laptop, or on a dedicated app like Hotstar? Most likely you use all of these mediums to watch a game of live cricket. Entertainment today is delivered everywhere and anywhere you choose to be. It is possible because #Hotstar and #Accenture have created a hyper-connected environment that delivers content on every medium, even when you have poor network connectivity. The energy sector suffered huge losses because of electricity leakage and was one of the first to adopt smart meters and smart energy distribution systems. A decade back, #USA #Florida-based Duke Energy claimed that its electrical system had the capacity to automatically detect, isolate, and reroute power when a problem occurs.
 

Your enterprise too is live 24/7, because customers expect you to provide service not just on the phone between 9 and 5, but at a time convenient to them and over a medium of their choice. The epidemic has only accelerated seamless service delivery over digital mediums in a transparent, reliable and secure manner. The enterprise has evolved in many ways, and new business models of collaboration are in play to deliver services to the consumer. The digital enterprise is evolving to a new paradigm, called the Hyper-connected Ecosystem. The biggest change we are witnessing today is not just the mandate for extraordinary agility and business resilience but also a drastic shift in consumer demand. The evolution of the digital ecosystems driving businesses today requires organizations, people and devices to connect seamlessly by leveraging an effective hyper-connected ecosystem.

Some leaders talk of the Hyper-connected Enterprise as the next phase of the digital enterprise; what they don't realize is that the evolution started years back, and you are already on a hyper-connected network. The core idea behind the digital enterprise was always to deliver services over a connected ecosystem, and the pandemic has necessitated innovation to make a business impact and ensure sustainability. The focus today is to enable new, innovative, open-ended technology solutions that are seamless and integration-ready to reach the masses.
 
Blocks of Hyper-connected Enterprise

 
Some enterprises may already have a well-designed information system that requires minimal work to become a hyper-connected enterprise. The remaining enterprises have to redesign their processes and underlying systems and imbibe a culture of digital dexterity. The enterprise vision has to change along with the employee mindset, to adapt and embrace emerging technologies alongside existing ones to achieve better business outcomes and deliver new products. Not an easy order, as it requires the enterprise and its employees to learn new skills as well as change the culture. Every industry is either moving to become a hyper-connected enterprise or will have to transform quickly if it intends to stay relevant and compete with competitors who have adopted the new way of doing business.

The journey

Today enterprises need to be connected to deliver value. From lead generation to fulfillment to customer support, all processes have to be digital. In the next post we will discuss some examples of the Everywhere Digital Enterprise. Until everything is connected to everything else...




 

Thursday, January 19

The Evolution of Software Integration

 
Every successful enterprise depends heavily on its underlying software applications and the communication between them. The problem is that, as time goes by, enterprises invariably end up with software created with disparate technologies and built by several vendors.

The number of software applications varies with the size of the organization. According to one research study, small businesses use an average of 10 to 22 applications, while in large enterprises this number rises to an amazing 700 to 1,000 software applications!

The Motivation for Software Integration

All these disparate software applications often need to work together, and this is where software integration comes in. I see various motivations for software integration when I talk to business owners and IT managers. They usually want to achieve one of the following:

  • Produce a unified set of functionalities, for example, a unified customer support system
  • Increase productivity by reducing the need to switch between applications
  • Have easier user adoption, especially if one of the software applications being integrated is new
  • Enable data analytics by getting data from multiple sources
  • Automate data entry – getting data from another application is less costly than manual data entry

In the early stages of software integration, one of the main issues was that everything was proprietary and closed. You would buy an application, and all the information you put into it was accessible only from within that application. Not to mention that it was often available only on a single machine or a limited set of machines. If you wanted, for example, to access that information from another software application, you were in trouble.

But where there is a will, there is a way, and so software integration started. The integration challenges were initially addressed with in-house solutions that themselves used ad hoc, proprietary protocols.

Eventually, with the addition of more and more software systems and the wider spread of internal and external networks, home-grown solutions were no longer adequate. Software integration evolution had reached a new level. The motto "no system is an island" became common, but there was still no standard solution to the problem.

Software Integration Evolution & APIs

Over several decades, enterprise integration technologies leveraged one or more integration styles. The earliest integration style was the Remote Procedure Call (RPC) style, used in single-language environments and very popular in the 80s and early 90s. Later, multi-language systems such as CORBA appeared. In these, an explicit Interface Definition Language (IDL) was used, so that the caller and callee could be implemented in different programming languages and even different platforms.

In the end, the use of Application Programming Interfaces, best known as APIs, became the rule. APIs emerged to expose business functionalities provided by one application to other applications. APIs exist so you can reuse them – the concept has been, from the beginning, that multiple applications could consume the same API. The idea was that developers could rely solely on the API to access an application's functionality without worrying about their implementation details.
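
As a hypothetical illustration, here is roughly what it looks like for one application to consume another's functionality through its API; the URL, token and fields below are made up:

    import requests  # widely used HTTP client library

    # Call a hypothetical CRM application's API instead of touching its
    # database or user interface directly.
    response = requests.get(
        "https://crm.example.com/api/v1/customers/42",
        headers={"Authorization": "Bearer <token>"},
    )
    response.raise_for_status()

    customer = response.json()
    # The consumer relies only on the documented contract, not on how
    # the CRM implements that functionality internally.
    print(customer["name"], customer["email"])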

What Is API Management & Why You Need It

When developers start using an API, they hope that the API is a contract that will not change. However, APIs are susceptible to the same environmental pressures to change that all software systems face. They are often upgraded, reworked, and sometimes refactored. When this happens, finding actual points of change in the API and making things work again is painstaking work for the developer. This is why API lifecycle management appeared.

Developers also hope they will have to work with as few APIs as possible. This is because each new API represents a new learning curve and involves time and effort. Moreover, when you come across the upgrade problems we mentioned, the developer knows it helps to have few APIs and few inter-dependencies - and not to fall into spaghetti-style integration.

Spaghetti Style Software Integration

The thing is, this is not always up to the developer, as the need to integrate different software systems keeps growing.

This is what API management is: API lifecycle management for multiple APIs.

As a result of sheer demand, API management using middleware has emerged as a way of using APIs and getting their advantages while avoiding their known problems. Please note that we are looking at API management from the API consumer's perspective. If you look at it as an API producer, the focus will be different.

API Management Tools As Integration Software Solutions - An Example

The new middleware technologies have eliminated the need to call APIs directly. Instead, the developer writes SQL in their new or legacy code and uses prebuilt connectors to translate the standard SQL syntax in that code into API calls. These calls retrieve the needed information from the target system. This works both for retrieving data (SELECT) and for inputting or changing it (INSERT, UPDATE, DELETE).

Connect Bridge API management

The middleware acts as a translator that speaks all the API variants the developer needs, exposing them through the ANSI-standard SQL syntax the developer already knows well and can use with their favorite programming language, such as Python, Java, or C#, to name a few.

By using such translating middleware, the developer no longer needs to learn a new programming language or gain expertise in the target system API. This makes all the difference, dramatically reducing the time and effort necessary to integrate software.

Using SQL Connector, the developer has two options:

  • he can build his own custom integration software in the programming language of his choice or
  • he can start from the source code of any software from the past 40 years.

In both cases, completing the integration requires few lines of code and is quite straightforward.
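
As a sketch of what those few lines might look like from the consumer side, assuming the middleware exposes an ODBC endpoint (the DSN, credentials, table and columns below are hypothetical, not the actual Connect Bridge schema):

    import pyodbc  # standard ODBC client library for Python

    # Connect to the middleware through an ODBC DSN (name is hypothetical).
    connection = pyodbc.connect("DSN=ConnectBridgeDemo;UID=demo;PWD=secret")
    cursor = connection.cursor()

    # Plain ANSI SQL; behind the scenes the middleware translates it
    # into API calls against the target system.
    cursor.execute("SELECT id, name, email FROM contacts WHERE city = ?", "Pune")
    for row in cursor.fetchall():
        print(row.id, row.name, row.email)

    # Writes work the same way and become create/update API calls.
    cursor.execute("UPDATE contacts SET email = ? WHERE id = ?", "new@example.com", 42)
    connection.commit()
    connection.close()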

Using such middleware also eliminates the need to redo your code when you upgrade the target system or its APIs. The middleware company itself will handle all the maintenance efforts. It is now their job to guarantee forward compatibility (and sometimes backward compatibility too).

Ultimately, API management gives enterprises greater flexibility by enabling them to go for the software integrations they need while shielding them from the negative aspects and not compromising on security, e.g. maintaining GDPR compliance.

Last word

Software integration has long been a pain point for businesses, often leading companies to either maintain their legacy systems for longer than they should or fork over large sums of money to developers to migrate to the latest and greatest.

Fortunately, with the evolution of software integration, you can easily solve current integration challenges and prepare your company for the future by using today's API management middleware. Whether you simply want to share data between systems, modernize legacy systems, or meet complex requirements, endless integration possibilities are at your fingertips once you start using API middleware.

Wednesday, August 31

The Secrets to DevOps Success - 2

A DevOps implementation strategy is a key focus for most organizations as they embark on their Digital Transformation journey. Though automating the software delivery process sounds like quite a straightforward initiative, it has many challenges, as I have discussed in past posts.

The DevOps market size exceeded $7 billion in 2021 and is expected to grow at a CAGR of over 20% from 2022 to 2028, to a value of over $30 billion. In 2021, 65% of the DevOps market's value in the USA came from DevOps solutions (tools), with 37% accounted for by services. By 2028, around 55% of the market's value is forecast to come from DevOps services and the remaining 45% from tools.

In 2001, when I implemented my first development process automation, it was more about automating redundant manual processes to save time and avoid manual errors in the build and release process. We were a small team delivering a small project for a US client; we used to face failures during every release, and it was very embarrassing for the entire team to attend the post-release meeting. All we wanted was a smooth, bug-free release without spending all night at the server machines. We automated our build and release process, and unknowingly we started working closely together to ensure the issues we had faced in the past did not recur. We collaborated across teams, we stopped blaming other teams, we learned every step of the code/build/test/release/configure/deploy process, we automated manual tasks and we monitored every step of the release process. Soon we were doing perfect code drops for every release, and we started leaving the office together to enjoy post-release drinks. We were not doing DevOps, but we experienced a cultural change and we were working as one team.
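
For flavor, here is a minimal sketch of the kind of release automation we were doing back then; the make targets are placeholders for whatever your build, test, package and deploy steps happen to be:

    import subprocess
    import sys

    # Each step of the release pipeline, run in order; a failure at any
    # step aborts the release instead of shipping a broken build at 2 a.m.
    STEPS = [
        ("build",   ["make", "build"]),
        ("test",    ["make", "test"]),
        ("package", ["make", "package"]),
        ("deploy",  ["make", "deploy"]),
    ]

    for name, command in STEPS:
        print(f"--- {name} ---")
        if subprocess.run(command).returncode != 0:
            print(f"Release aborted: step '{name}' failed")
            sys.exit(1)

    print("Release completed cleanly")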

Over the last few years we have been recommending DevOps to our clients as the right way to do releases on their business or digital transformation journey. What we have observed is that, in spite of the large number of new tools, dashboards and on-demand infrastructure, it is still a big challenge to implement a successful DevOps process in an organization. Let's take a quick look at some of the things that can help.

To implement technology strategically, businesses need to start by creating a cultural shift toward the principles of DevOps. It starts with people and processes, and then products. You cannot simply throw a tool or even multiple tools at the problem and hope that it will be solved. To transform your business, you need to embrace velocity: making incremental changes, delivering small iterations and going faster. This often means disrupting your own business and cannibalizing your existing offerings before disrupting the market. There are a few key elements of DevOps culture that must be adopted before you begin thinking about your product toolkit.

1) Empower teams by embracing collaboration 

Encouraging collaboration is one crucial way to empower employees. By keeping all stakeholders involved in the process, employees can communicate impact in real time, keeping the execution process moving along. Collaboration enables product management, development, QA, security and operations teams to work together directly instead of waiting for handoffs. The values of diversity, inclusion and belonging are fundamental to creating a culture of collaboration within your organization. Collaboration across teams and across levels brings in multiple perspectives, and by ensuring that each perspective has a say we invite innovative ideas, empowered teams and smarter, more informed decision-making. The culture of collaboration has to be driven down the hierarchy by the top leadership, leading by example and rewarding collaborators. If collaboration is not yet one of the KPIs for management leaders, it is time to embrace it now.

2) Iteration Planning

You can go faster by breaking things down into smaller pieces. The smaller we split things up, the smaller the steps we take, the faster we can go. Smaller iterations are better because they take less time, are delivered faster, carry lower risk and have a quick turnaround time, encouraging people to be more creative. I remember my mother telling me to take small bites and chew well, and the food will show on you; it worked there and it works everywhere. Encouraging iteration is also a step toward moving away from the stagnant waterfall mentality to an agile, calculated risk-taking attitude.

3) Focus on results

Employees should be acknowledged for what they accomplish and complete, not how long it took them or where they worked. Create a culture where team members feel trusted to structure their own days and do what it takes to get the results that customers require. Start by finding simple solutions to the problem instead of flashy, complicated ones.

It is impossible to transform a business without setting the mood with a collaborative culture. Start by finding ways for collaboration in areas where you currently have silos, iteration where there is stagnancy, and efficiency where there are lags.


Monday, August 29

The Secret to DevOps Success

Gartner predicted that through 2022, 75% of DevOps initiatives would fail to meet expectations due to issues around organizational learning and change, and in 2021 a Tech Radar survey indicated that 80% of DevOps initiatives failed to achieve the desired goals - mind you, this is the percentage of projects that failed to meet the desired goals and expectations. In other words, people-related factors tend to be a bigger challenge when implementing DevOps than technology and tooling challenges.

DevOps delivers maximum value when aligned to customer value

It has been observed that organizations often launch DevOps efforts with insufficient consideration of business outcomes and without clarity of goals. I&O (infrastructure and operations) leaders need to ensure that staff and customers connect with the term "DevOps," and the value it will bring, prior to introducing the initiative.

Organizations should use marketing to identify, anticipate and deliver the value of DevOps in a manner that makes business sense. "Leaders must seek to refine their understanding of customer value on an ongoing basis to evolve capabilities and further enable organizational change."

DevOps fails when the team mix & organizational change are not managed

In Gartner's 2017 Enterprise DevOps Survey, 88% of respondents said team culture was among the top three people-related attributes with the greatest impact on their organization's ability to scale DevOps. In 2020, TechRadar ran a similar survey, and over 90% of CIOs responded that their priority was to build a DevOps culture. However, organizations overlook the importance of getting the right mix of staff on board with the upcoming DevOps change and instead focus their efforts on DevOps tools.

Tools are not the solution to a cultural problem

It sounds repetitive, but I still need to reiterate that tools are not the solution to a cultural problem. Organizations have to identify candidates with the right attitude for adopting DevOps practices. Individuals who demonstrate the core values of teamwork, accountability and lifelong learning will be strong DevOps players.

Lack of collaboration affects success of DevOps

Successful DevOps efforts require collaboration with all stakeholders. More often than not, DevOps efforts are instead limited to I&O. Organizations cannot improve their time to value through uncoordinated groups or those focusing on I&O exclusively.

Break down barriers and forge a team-like atmosphere. Different teams must work together, rather than in uncoordinated silos, to optimize work. This might start with seeking out an executive who can carry the teams and champion the effort.

Trying to do too much too quickly

It is important to realize that a big-bang approach — in other words, launching DevOps in a single step — comes with a huge risk of failure. DevOps involves too many variables for this method to be successful in a large IT organization.

It is recommended to use an incremental, iterative approach to implementing DevOps, enabling the organization to focus on continual improvement and ensure that all groups are collaborating. It is also recommended to start with a politically friendly group to socialize the value of DevOps and reinforce the credibility of the initiative.

Unrealistic expectations of DevOps

Similar to struggling with grounding DevOps initiatives in customer value, a disconnect exists in many organizations between expectations for DevOps and what it can actually deliver.

"Expectation management and marketing are continuous and not a one-time affair"

Manage expectations by agreeing on objectives and metrics. Use marketing to identify, anticipate and satisfy customer value in an ongoing manner. Expectation management and marketing are continuous efforts, not a one-time affair. The bottom line: unless the entire organization understands and appreciates the benefits of DevOps, and takes the effort to collaborate and bring cultural change across the Development, Testing and Operations teams, DevOps will not be successful.

Sunday, June 5

Why do today's digital enterprises need DevOps?

There was a time in the pre-digital era when a new business idea alone would disrupt the business landscape. Take the example of the restaurant industry that we are all familiar with. Let's assume that a couple of decades back, a restaurant with a seating capacity of 10 people decided it could grow its business by taking home-delivery orders over the phone. The restaurant increased its business from 10 orders/hour to 30 orders/hour - mind you, no software was used, just a great business idea implemented using the telephone. Recently another disruptive idea, this one leveraging software, changed the food business landscape: the Cloud Kitchen, which has disrupted the food business over the last 4 years. The same restaurant with a seating capacity of 10 people can now outsource its kitchen to 10 cloud kitchens across the city and service 300 orders every hour. If you do not know what a Cloud Kitchen is, do read about it; it is a most interesting software-driven business disruption that many people don't know about.

Today businesses continue to be disrupted not just by new business ideas but also by software agility and innovation. Software-defined disruptions like mobile apps or Cloud Kitchens have changed the food business landscape and continue to drive tremendous business value like never before. It is important to understand that as software disrupts industry after industry, every enterprise is turning into a software company. As a software company, all these enterprises need a seamless SDLC (software delivery lifecycle) process integrated with testing tools, deployment tools, and application and business monitoring tools, along with a team that can promptly support any issue in the software delivery with a short turnaround time.

The main driver for adopting DevOps comes from the enterprise's need to innovate and accelerate the business. Unless new ideas are implemented in software and shipped fast by a unified, capable team, the business does not get the desired outcome.

Key points to note about business and need for devops are as follows-

Businesses develop software for a reason: to get business value

Today every business is either a digital business or in the process of Digital Transformation

Businesses will continue to be disrupted by software innovation & agility, for example mobile apps

Disruptions like Cloud Kitchens are changing the landscape & delivering tremendous business value

Software disruption also means that enterprises are becoming software companies

So the main driver for DevOps is the enterprise's need to innovate and accelerate the business

  



Wednesday, March 30

Will Auto Machine Learning replace Data Scientists over the next few years?

What is AutoML?

Auto Machine Learning, or AutoML, enables developers with limited machine learning expertise to train high-quality models specific to their business needs - you can build your own custom machine learning model in minutes. Automated Machine Learning is about producing machine learning solutions for the data scientist without endless manual exploration of data preparation, model selection, model hyper-parameters, and model compression parameters.

How does the AutoML process work?

Auto Machine Learning is typically a platform or open-source library that simplifies each step of the machine learning process, from handling a raw dataset to deploying a practical machine learning model. In traditional machine learning, models are developed by hand, and each step in the process must be handled separately.


AutoML automatically maps the optimal type of machine learning algorithm for a given task. It does this with two concepts:


  1. Neural architecture search, which automates the design of neural networks. This helps AutoML models discover new architectures for problems that require them.
  2. Transfer learning, in which pretrained models apply what they've learned to new data sets. Transfer learning helps AutoML apply existing architectures to new problems that require it.

Users with minimal machine learning and deep learning knowledge can then interface with the models through a relatively simple coding language like Python or R. These are some standard steps of the machine learning process that AutoML can automate, in the order they occur in the process:

  • Raw data processing
  • Feature engineering and feature selection
  • Model selection
  • Hyperparameter optimization and parameter optimization
  • Deployment with consideration for business and technology constraints
  • Evaluation metric selection
  • Monitoring and problem checking
  • Analysis of results

Why is AutoML a game changer?

I think AutoML is a game changer because it represents a milestone in the fields of machine learning and artificial intelligence (AI). AI and machine learning have long been subject to the "black box" criticism, meaning that machine learning algorithms can be difficult to reverse engineer. Although they improve efficiency and processing power to produce results, it can be difficult to track how the algorithm delivered a given output. Consequently, it is also difficult to choose the correct model for a given problem, because it is hard to predict a result if the model is a black box.

AutoML makes machine learning less of a black box by making it more accessible. It automates the parts of the machine learning process that apply algorithms to real-world scenarios. A human performing these tasks would need an understanding of the algorithm's internal logic and how it relates to the real-world scenario. AutoML learns about learning and makes choices that would be too time-consuming or resource-intensive for humans to make efficiently at scale.

Fine-tuning the end-to-end machine learning process -- or machine learning pipeline -- through meta-learning has been made possible by AutoML. We can say AutoML represents a step towards general AI and towards making AI accessible to non-techy domain experts.

Getting started with AutoML

You can get started by trying some popular AutoML platforms like the following (a minimal code sketch follows the list):

  • Google AutoML - Google's proprietary, cloud-based automated machine learning platform.
  • Azure Automated Machine Learning - a proprietary, cloud-based platform.
  • AutoKeras - an open-source software library developed by the DATA lab at Texas A&M University.
  • Auto-sklearn - an open-source, commercially usable library that evolved from scikit-learn, the collection of simple machine learning tools in Python. You can find it on GitHub.
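
As a taste of how little code an AutoML run needs, here is a minimal sketch using Auto-sklearn (assuming the library is installed; the dataset and time budget are arbitrary choices):

    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score
    from autosklearn.classification import AutoSklearnClassifier

    # A small benchmark dataset stands in for your business data.
    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

    # AutoML searches preprocessing steps, models and hyper-parameters
    # on its own within the given time budget.
    automl = AutoSklearnClassifier(time_left_for_this_task=300)  # 5 minutes, arbitrary
    automl.fit(X_train, y_train)

    print("Accuracy:", accuracy_score(y_test, automl.predict(X_test)))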

Would AutoML replace Data Scientists?

Like most automation, AutoML is designed to perform rote tasks efficiently, with accuracy and precision, freeing up employees to focus on more complex or novel tasks. Things that AutoML automates, like monitoring, analysis and problem detection, are faster when automated. A human should still be involved to assess and supervise the model, but no longer needs to participate in the machine learning process step by step. AutoML should improve the efficiency of data scientists and other employees, not replace them. Will AutoML reduce the dependency of business on data scientists? Yes, to a degree it will reduce the dependency on data scientists for menial machine learning tasks, but it also enables domain experts to apply machine learning to less complex tasks.

As of 2021, Auto Machine Learning is a relatively young area, and even the most popular tools are not yet fully developed. If you look back at the history of software, it is inevitable that automated tools evolve to take over the mundane tasks, reducing dependency on developers. Those who have entered the field of data science without a programming background should be wary that one day machine learning will become packaged software; my prediction is that the demand for ML developers will reduce in another 3 to 4 years, maybe around 2025. For every developer or data scientist entering the data science or machine learning space, my advice is to build a strong programming base along with machine learning and AI, which will enable them to adapt to the inevitable change in the demands of the IT industry.

 

Monday, December 20

Upgrading from Healthcare Solutions to Humancare Solutions: Part-1

The Oxford dictionary defines healthcare as 'the organized provision of medical care to individuals or a community'. This crisp definition does not quite explain the purpose and goal of a good healthcare system.

In my view, the complete definition of 'healthcare' should be: an integrated system that proactively delivers care to individuals. A healthcare system should store and use patient data and clinical data to provide better insight into a patient's health, which in turn helps the medical profession give better service to the patient at a lower cost.

From a technology provider's perspective, a good healthcare system uses continuous advances in technology to connect and organize the disparate entities of the healthcare landscape and deliver a seamless experience to individuals and entities. Every entity in the healthcare landscape benefits and profits from a good healthcare system, but the ultimate beneficiary has to be the individual seeking healthcare services.

What needs to change for Health Care to become Human Care?

What I am trying to say is that most healthcare systems that exist today are focused on delivering medical services rather than health care to individuals. There is a need to build healthcare systems that keep individual care at the core of the system design, and that means lifelong care of every individual who approaches the system. Once an individual requires medical services, he becomes part of the healthcare system, and the system should proactively monitor, manage and deliver health care to individual patients. We are talking big: we are talking about a system built around individual health care, a system that reaches out to the individual rather than waiting for individuals to seek medical services, because the purpose of a responsible society and medical community in a vibrant democracy is to ensure good health for every individual.

So what is required of a good health care system?

1) Keep a record of all individuals from birth, or from the time they register

2) Own the responsibility of maintaining the medical records of every registered individual

3) Use medical records and clinical data to proactively reach out to individuals for health checkups

4) After treatment of chronic diseases, proactively monitor the health of registered individuals

5) Proactively deliver medical advice to all registered individuals

6) Share and connect an individual's medical history across the health care network

Let me take the example of a cancer patient who becomes part of the health care system at the age of 60. Let's assume that after taking treatment the patient gets well, goes home, and does not feel the need to approach the hospital again. Health care providers know that cancer is a chronic disease and needs lifelong monitoring. The health care system should devise a care plan for the cancer patient and proactively connect with the individual to check on their health and recommend timely checkups for recurrence of the cancer. Recurrence is common in some types of cancer, and as a healthcare expert the system has the data to predict the possibility of recurrence and can save lives through periodic checkups.
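
To make this concrete, here is a minimal sketch of what such a proactive care module could look like; the record fields and followup intervals are hypothetical, purely illustrative:

    from dataclasses import dataclass
    from datetime import date, timedelta

    @dataclass
    class PatientRecord:
        patient_id: int
        condition: str   # e.g. "cancer", "coronary blockage"
        last_checkup: date

    # Hypothetical followup intervals per chronic condition, in days.
    FOLLOWUP_INTERVAL = {
        "cancer": 180,             # six-monthly recurrence screening
        "coronary blockage": 365,  # yearly cardiac review
    }

    def due_for_checkup(record: PatientRecord, today: date) -> bool:
        """True when the care plan says it is time to reach out proactively."""
        interval = FOLLOWUP_INTERVAL.get(record.condition)
        return interval is not None and today >= record.last_checkup + timedelta(days=interval)

    patients = [
        PatientRecord(1, "cancer", date(2021, 5, 1)),
        PatientRecord(2, "coronary blockage", date(2021, 9, 15)),
    ]
    for p in patients:
        if due_for_checkup(p, date(2021, 12, 20)):
            print(f"Reach out to patient {p.patient_id} for a {p.condition} checkup")

A real system would drive these reminders from clinical guidelines and per-patient care plans rather than a hard-coded table, but the principle is the same: the system, not the patient, initiates the followup.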

 

Another example is an individual who becomes part of the healthcare system when he gets treated for a coronary blockage. Medical professionals and healthcare systems have data showing that even after removing a coronary blockage, there is a high probability that a patient 'with a heart condition' will face similar medical conditions over a period of time and will require periodic checkups. The point I am trying to put across is that health care is not just providing medical services; health care is about providing care for the health of individuals. We as experts in IT and medicine know we can provide health care in the true sense by designing smart systems that use individual and clinical data and save lives. Individuals who often neglect medical conditions out of lack of knowledge or ignorance can be kept in the healthcare network through proactive followups.

There is a cost associated with building such smart systems, maintaining the data, and proactively connecting with every individual registered in the healthcare system. This cost is very small compared to the medical expenses and suffering the individual has to bear if a disease is not detected early. Insurance companies would love to have smart health care systems that do proactive checkups and detect medical conditions early; it would help them save billions on treatment of the insured. The challenge is that we do not have such smart health care systems with a built-in care module that benefits individuals, insurance companies and health care providers alike, even though everybody wants affordable health care.

Smart Health Care is a need of our society because

  • Smart Health Care ensures proactive monitoring and early detection of medical issues
  • Smart Health Care saves money spent on health of every individual
  • Smart Health Care ensures limited medical infrastructure can service more individuals
  • Smart Health Care ensures insurance companies pay less on medical treatments of their insured
  • Smart Health Care uses data for predicting diseases
  • Smart Health Care can help pharma industry to develop better medicine
  • Smart Health Care can help countries eradicate many diseases/illness
  • Smart Health Care ensures healthy and productive community
  • Smart Health Care is also a right of every individual


Smart Health Care, Covid and Data

#Covid is the latest use case proving the value of a Smart Health Care system: it would have simplified the management of Covid cases, helped us give better treatment to all registered individuals, and given us real-time clinical data to find effective treatment procedures for an epidemic like Covid. Only after months of treatment did scientists find that certain medicines were not effective against Covid, because we do not have a unified system to collect data on individuals. If every individual were registered with one or more healthcare systems, we could have analyzed the data in real time, identified the most effective treatment procedures within weeks, and saved millions of lives. In 2021 everybody understands the value of data; unfortunately, we do not have a system to collect, store and derive insights from it.

In the next post -

I hope you have followed my thinking behind this post. In the next post I plan to share a high-level design of a smart health care system that is beneficial as well as profitable to every entity in the healthcare system: a system that delivers benefits to individuals, hospitals and insurance companies, as well as to scientists and pharma companies. I am talking about changing the way we look at health care 'as a service for those who want it' and making healthcare 'an essential service that takes care of people in an inclusive manner'. The time has come to move from Health Care to Human Care and guarantee proactive monitoring of health and timely, affordable treatment to every individual - women, men, as well as newborn children - by plugging them into the healthcare network.

In a connected world, no human should be disconnected from the Health Care network. When our public as well as private healthcare providers unite to build a seamless healthcare network, we can truly deliver Human Care, aka healthcare with a human touch, and not just medical treatment for those who reach a hospital and can afford the expenses.



Wednesday, November 3

World Of Health Care - Top 10 challenges and opportunities in Health Care


1. Costs and transparency. Implementing strategies and tactics to address growth of medical and pharmaceutical costs and impacts to access and quality of care.

2. Consumer experience. Understanding, addressing, and assuring that all consumer interactions and outcomes are easy, convenient, timely, streamlined, and cohesive so that health fits naturally into the “life flow” of every individual’s, family’s and community’s daily activities.

3. Delivery system transformation. Operationalizing and scaling coordination and delivery system transformation of medical and non-medical services via partnerships and collaborations between healthcare and community-based organizations to overcome barriers including social determinants of health to effect better outcomes.

4. Data and analytics. Leveraging advanced analytics and new sources of disparate, non-standard, unstructured, highly variable data (history, labs, Rx, sensors, mHealth, IoT, Socioeconomic, geographic, genomic, demographic, lifestyle behaviors) to improve health outcomes, reduce administrative burdens, and support transition from volume to value and facilitate individual/provider/payer effectiveness.

5. Interoperability/consumer data access. Integrating and improving the exchange of member, payer, patient, provider data, and workflows to bring value of aggregated data and systems (EHR’s, HIE’s, financial, admin,  and clinical data, etc.) on a near real-time and cost-effective basis to all stakeholders equitably.

6. Holistic individual health. Identifying, addressing, and improving the member/patient’s overall medical, lifestyle/behavioral, socioeconomic, cultural, financial, educational, geographic, and environmental well-being for a frictionless and connected healthcare experience.

7. Next-generation payment models. Developing and integrating technical and operational infrastructure and programs for a more collaborative and equitable approach to manage costs, sharing risk and enhanced quality outcomes in the transition from volume to value (bundled payment, episodes of care, shared savings, risk-sharing, etc.).

8. Accessible points of care. Telehealth, mHealth, wearables, digital devices, retail clinics, home-based care, micro-hospitals; and acceptance of these and other initiatives moving care closer to home and office.

9. Healthcare policy. Dealing with repeal/replace/modification of current healthcare policy, regulations, political uncertainty/antagonism and lack of a disciplined regulatory process. Medicare-for-All, single payer, Medicare/Medicaid buy-in, block grants, surprise billing, provider directories, association health plans, and short-term policies, FHIR standards, and other mandates.

10. Privacy/security. Staying ahead of cybersecurity threats on the privacy of consumer and other healthcare information to enhance consumer trust in sharing data. Staying current with changing landscape of federal and state privacy laws.

“We are seeing more change in the 2020 HCEG Top 10 than we have seen in recent years and for good reason. HCEG member organizations express that the demand for, and pace of change and innovation is accelerating as healthcare has moved to center stage in the national debate. It shouldn’t be surprising that costs and transparency are at the top of the list along with the consumer experience and delivery system transformation,” says Ferris W. Taylor, Executive Director of HCEG. “Data, analytics, technology, and interoperability are still ongoing challenges and opportunities. At the same time, executives need to be cautious, as individual health, consumer access, privacy, and security are on-going challenges that also need to remain as priorities.”  

Turning challenges into opportunities

Reducing costs means lower revenue for providers and almost all of the players in healthcare––except for consumers and payers, says Mark Nathan, CEO and founder of Zipari, a health insurtech company. So while there are many incentives to keep healthcare costs high, if consumers are provided with the information they need to improve their health and drive down their personal costs, then we could see consumers en masse making decisions that drive down costs across the industry, he adds.

“Predicting cost in the traditional health insurance environment is shockingly complex,” Nathan says. “The most advanced payers can simulate claims and predict the cost of procedures. However, as you layer in full episodes of care, such as knee surgery, it becomes much harder to accurately predict the patient's total out-of-pocket cost. Bundled value-based payments start to make cost transparency a little easier to predict, but most plans still have a way to go to get to that type of offering.”

The greatest opportunity to drive down health costs––for payers, consumers, and system-wide––is with the payer-consumer relationship, he says. “Payers have the information consumers need to make better decisions about their health and finances––if plans can build positive and trusted relationships with their members. Once a payer proves it can make valuable and trusted recommendations, the consumer can make the decisions that will not only lead to better health outcomes but also to reduced cost of care.”


Saturday, June 12

Agile vs. Scrum

Two of the most common (and often conflated) approaches to project management are Agile and Scrum. Developers often ask how Scrum and Agile differ from one another, and how to choose the right approach for their project.
 
What is Agile Project Management?

Agile project management is a project philosophy or framework that takes an iterative approach towards the completion of a project. According to the Project Management Institute (PMI), the goal of the Agile approach is to create early, measurable ROI through defined, iterative delivery of product features.

Due to the iterative nature of Agile approaches, continuous involvement with the client is necessary to ensure that the expectations are aligned and to allow the project manager to adapt to changes throughout the process.

Agile is primarily a project management philosophy centered on specific values and principles. Think of Agile broadly as a guiding orientation for how we approach project work. The hallmark of an Agile approach is those key values and principles which can then be applied across different, specific methodologies.  

"If you're following an Agile philosophy in managing your projects, you'll want to have regular interactions with the client and/or end-users; you're committed to a more open understanding of scope that may evolve based on feedback from end-users; and you'll take an iterative approach to delivering the scope of work," Griffin says.

There are many different project management methodologies used to implement the Agile philosophy. Some of the most common include Kanban, Extreme Programming (XP), and Scrum.

What is Scrum Project Management?

Scrum project management is one of the most popular Agile methodologies used by project managers.

"Whereas Agile is a philosophy or orientation, Scrum is a specific methodology for how one manages a project," Griffin says. "It provides a process for how to identify the work, who will do the work, how it will be done, and when it will be completed by."

In Scrum project management, the project team, led by the project manager, consists of a product owner, Scrum master, and other cross-functional team members. The product owner is responsible for maximizing the value of the product, while the Scrum master is accountable for ensuring that the project team follows the Scrum methodology.

The Scrum methodology is characterized by short phases or "sprints" when project work occurs. During sprint planning, the project team identifies a small part of the scope to be completed during the upcoming sprint, which is usually a two to four week period of time.

At the end of the sprint, this work should be ready to be delivered to the client. Finally, the sprint ends with a sprint review and retrospective—or rather, lessons learned. This cycle is repeated throughout the project lifecycle until the entirety of the scope has been delivered. This mirrors aspects of traditional project management. One of the key differences, however, is how one creates "shippable" portions of the project along the way rather than delivering everything at the very end. Doing so allows the client to realize the value of the project throughout the process rather than waiting until the project is closed to see results.

What are the differences between Agile and Scrum?

On the surface, it is easy to see why Agile and Scrum can often be confused, as they both rely on an iterative process, frequent client interaction, and collaborative decision making. The key difference between Agile and Scrum is that while Agile is a project management philosophy that utilizes a core set of values or principles, Scrum is a specific Agile methodology that is used to facilitate a project.

There are also other notable differences between Agile and Scrum.

Key Differences:

Agile is a philosophy, whereas Scrum is a type of Agile methodology
Scrum breaks work into short sprints with smaller deliverables, while Agile in general does not prescribe a particular delivery cadence
Agile involves members from various cross-functional teams, while a Scrum project team includes specific roles, such as the Scrum Master and Product Owner

It's important to remember that although Scrum is an Agile approach, Agile does not always mean Scrum—there are many different methodologies that take an Agile approach to project management.

Agile vs. Other Methodologies

While Agile and Scrum often get most of the attention, there are other methodologies you should be aware of. Below is a look at how Agile compares to Waterfall and Kanban, two popular project management strategies.
Agile vs. Waterfall

Waterfall project management is another popular strategy that takes a different approach to project management than Agile. While Agile is an iterative and adaptive approach to project management, Waterfall is linear in nature and doesn't allow for revisiting previous steps and phases.

Waterfall works well for small projects with clear end goals, while Agile is best for large projects that require more flexibility. Another key difference between these two approaches is the level of stakeholder involvement. In Waterfall, clients aren't typically involved, whereas in Agile, client feedback is crucial.

Agile vs. Kanban

Kanban project management is a type of Agile methodology that seeks to improve the project management process through workflow visualization using a tool called a Kanban board. A Kanban board is composed of columns that depict a specific stage in the project management process, with cards or sticky notes representing tasks placed in the appropriate stage. As the project progresses, the cards will move from column to column on the board until they are completed.

A key difference between Kanban and other Agile methodologies, such as Scrum, is that there are typically limitations regarding how many tasks can be in progress at one time. Project management teams will typically assign a specific number of tasks to each column on the board, which means that new tasks cannot begin until others have been completed.
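
A minimal sketch of a Kanban board with work-in-progress (WIP) limits makes the mechanics concrete; the column names and limits below are arbitrary:

    class KanbanBoard:
        def __init__(self, wip_limits):
            # One list of task cards per column, e.g. "To Do", "Doing", "Done".
            self.wip_limits = wip_limits
            self.columns = {name: [] for name in wip_limits}

        def add(self, column, task):
            # Enforce the WIP limit: new work cannot enter a full column.
            if len(self.columns[column]) >= self.wip_limits[column]:
                raise ValueError(f"WIP limit reached for '{column}'")
            self.columns[column].append(task)

        def move(self, task, source, destination):
            self.columns[source].remove(task)
            self.add(destination, task)

    board = KanbanBoard({"To Do": 5, "Doing": 2, "Done": 100})
    board.add("To Do", "design login page")
    board.add("To Do", "fix checkout bug")
    board.move("design login page", "To Do", "Doing")
    print(board.columns)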

Agile vs. Scrum: Choosing the Right Project Methodology

Once you have a clear understanding of what Agile and Scrum are and how they work together, you can begin to think about applying these approaches to your own projects. But, given the differences between the two, this shouldn't be a question of whether you should take an Agile or a Scrum approach.

Instead, if you decide that an Agile approach is right for a particular project, the question is: Which Agile methodology should you use? The answer could be Scrum, or it could be one of the other various Agile methodologies that exist.

To decide if Agile is right for your project, you'll need to look at the specific requirements and constraints involved. Agile was originally created within the context of software development projects and is particularly effective in this arena. That said, an Agile approach will not be effective for projects with a very strict, fixed scope and rigid development requirements. However, the guiding principles of the Agile philosophy are widely used across many different types of projects.

If an Agile approach is right for your project, you will then need to determine whether or not Scrum is the best Agile methodology for your specific needs and goals. Scrum is typically best suited to projects which do not have clear requirements, are likely to experience change, and/or require frequent testing.

It's important to remember that the key to a successful project isn't just about choosing the right methodology, but executing that methodology in a skillful manner. Doing so requires an expert understanding of the methodology you ultimately decide to employ in conjunction with other critical project management skills.  To be successful in their roles, project managers also need to know how to communicate effectively, lead a team, apply critical thinking and problem-solving skills, and be adaptable to the organizational dynamics and complexities around them.

Sunday, May 23

Workday Architecture

Workday Software As A Service

Workday is considered a leader in HR, payroll, and financial management services, and it is a top SaaS-based cloud enterprise solution for performing many human resource business operations. Workday is an American cloud-based software company founded in 2005 by David Duffield (founder and former CEO of the ERP company PeopleSoft) and Aneel Bhusri, and it is headquartered in Pleasanton, California (United States of America). The main purpose of the Workday cloud-based management tool is to provide SaaS-based services such as human resource management and financial management, offering a new level of enterprise agility compared with buying, deploying, and maintaining legacy on-premise applications. The Workday tool is used by more than 200 companies, ranging from mid-level companies to Fortune 500 companies. Workday is divided into different modules, two of which are considered the most important: Workday Human Capital Management and Workday Financial Management. These two modules play a key role in providing unparalleled agility, ease of management, and high-level integration capability. The top partners of the Workday organization are Ceridian, Kronos, Plateau, Salesforce.com, Cornerstone OnDemand, NETtime Solutions, Patersons, Safeguard World International, StepStone Solutions, and Taleo.



At the heart of the architecture are the Object Management Services (OMS), a cluster of services that act as an in-memory database and host the business logic for all Workday applications. The OMS cluster is implemented in Java and runs as a servlet within Apache Tomcat. The OMS also provides the runtime for XpressO — Workday’s application programming language in which most of our business logic is implemented. Reporting and analytics capabilities in Workday are provided by the Analytics service which works closely with the OMS, giving it direct access to Workday’s business objects.

The Persistence Services include a SQL database for business objects and a NoSQL database for documents. The OMS loads all business objects into memory as it starts up. Once the OMS is up and running, it doesn’t rely on the SQL database for read operations. The OMS does, of course, update the database as business objects are modified. Using just a few tables, the OMS treats the SQL database as a key-value store rather than a relational database. Although the SQL database plays a limited role at runtime, it performs an essential role in the backup and recovery of data.
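Workday's actual implementation is proprietary, but the pattern just described, objects held in memory with a SQL table used only as a key-value store for writes and recovery, can be sketched in Python; sqlite3 and the single-table layout below are stand-ins, not Workday's real schema:

    import json
    import sqlite3

    conn = sqlite3.connect(":memory:")  # stand-in for the real SQL database
    conn.execute("CREATE TABLE objects (id TEXT PRIMARY KEY, body TEXT)")

    class ObjectStore:
        def __init__(self, conn):
            self.conn = conn
            # Load every object into memory at startup; after this point,
            # reads never touch the database.
            rows = conn.execute("SELECT id, body FROM objects").fetchall()
            self.cache = {oid: json.loads(body) for oid, body in rows}

        def get(self, oid):
            return self.cache.get(oid)  # served entirely from memory

        def put(self, oid, obj):
            # Writes update both the in-memory cache and the backing table.
            self.cache[oid] = obj
            self.conn.execute(
                "INSERT OR REPLACE INTO objects (id, body) VALUES (?, ?)",
                (oid, json.dumps(obj)),
            )
            self.conn.commit()

    store = ObjectStore(conn)
    store.put("worker/42", {"name": "Ada", "dept": "Engineering"})
    print(store.get("worker/42"))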

The UI Services support a wide variety of mobile and browser-based clients. Workday’s UI is rendered using HTML and a library of JavaScript widgets. The UI Services are implemented in Java and Spring.

The Integration Services provide a way to synchronize the data stored within Workday with the many different systems used by our customers. These services run integrations developed by our partners and customers in a secure, isolated, and supervised environment. Many pre-built connectors are provided alongside a variety of data transformation technologies and transports for building custom integrations. The most popular technologies for custom integrations are XSLT for data transformation and SFTP for data delivery.
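As a hedged sketch of that popular combination, here is what a comparable custom integration could look like in Python outside Workday's own tooling, using lxml for the XSLT step and paramiko for SFTP delivery; the file names, host, and credentials are all placeholders:

    from lxml import etree  # pip install lxml
    import paramiko         # pip install paramiko

    # Transform an XML extract with an XSLT stylesheet (placeholder files).
    stylesheet = etree.XSLT(etree.parse("workers_to_csv.xsl"))
    result = stylesheet(etree.parse("workers.xml"))
    with open("workers.csv", "w") as out:
        out.write(str(result))

    # Deliver the transformed file over SFTP (placeholder host and credentials).
    transport = paramiko.Transport(("sftp.example.com", 22))
    transport.connect(username="integration", password="change-me")
    sftp = paramiko.SFTPClient.from_transport(transport)
    sftp.put("workers.csv", "/inbound/workers.csv")
    sftp.close()
    transport.close()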

The Deployment tools support new customers as they migrate from their legacy systems into Workday. These tools are also used when existing customers adopt additional Workday products.

Workday’s Operations teams monitor the health and performance of these services using a variety of tools. Realtime health information is collected by Prometheus and Sensu and displayed on Wavefront dashboards as time series graphs. Event logs are collected using a Kafka message bus and stored on the Hadoop Distributed File System, commonly referred to as HDFS. Long-term performance trends can be analyzed using the data in HDFS.


According to Gartner's 2019 report, Workday is considered a leader in data integration. Workday acts as middleware that hosts data integrations and transmits data. Workday was developed to help the financial management, human resources, and payroll teams in any organization. I hope this post helps a few of you learn and gain valuable information about Workday.

Friday, April 30

Complex Event Processing on AWS - AWS EventBridge

Complex Event Processing on AWS

Amazon EventBridge is a serverless event bus that makes it easier to build event-driven applications at scale using events generated from your applications, integrated Software-as-a-Service (SaaS) applications, and AWS services. 

EventBridge delivers a stream of real-time data from event sources. Routing rules determine where to send your data, so you can build application architectures that react in real time to your data sources, with event publishers and consumers completely decoupled. Amazon EventBridge enables developers to route events between AWS services, integrated SaaS applications, and your own applications, and it can help decouple applications and produce more extensible, maintainable architectures.

With the new API destinations feature, EventBridge can now integrate with services outside of AWS using REST API calls.

EventBridge architecture

Event-driven architecture enables developers to create decoupled services across applications. When combined with the range of managed services available in AWS, this approach can make applications highly scalable and flexible, with minimal maintenance.

Many services in the AWS Cloud produce events, including integrated software as a service (SaaS) applications. Your custom applications can also produce and consume events. With so many events from different sources, you need a way to coordinate this traffic. Amazon EventBridge is a serverless event bus that helps manage how all these events are routed throughout your applications.

The routing logic is managed by rules that evaluate incoming events against event patterns. EventBridge delivers matching events to targets such as AWS Lambda, so you can process events with your custom business logic.
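As a minimal sketch of such a rule, the following Python (boto3) snippet creates a rule on the default event bus and points it at a Lambda function; the rule name, pattern fields, and ARN are illustrative placeholders, and AWS credentials are assumed to be configured:

    import json
    import boto3

    events = boto3.client("events")

    # Create a rule that matches approved transactions from a custom source.
    events.put_rule(
        Name="atm-approved-transactions",
        EventPattern=json.dumps({
            "source": ["custom.myATMapp"],
            "detail-type": ["transaction"],
            "detail": {"result": ["approved"]},
        }),
        State="ENABLED",
    )

    # Route matching events to a Lambda target (placeholder ARN). In practice
    # the function also needs a resource policy allowing EventBridge to invoke it.
    events.put_targets(
        Rule="atm-approved-transactions",
        Targets=[{
            "Id": "approved-tx-consumer",
            "Arn": "arn:aws:lambda:us-east-1:123456789012:function:ProcessApprovedTx",
        }],
    )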

How EventBridge works

A banking application for automated teller machines (ATMs) produces events about transactions. It sends the events to EventBridge, which then uses rules defined by the application to route them accordingly. Three downstream services each consume a subset of these events.
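The producer side of this example could look like the following boto3 sketch, publishing an illustrative transaction event that the rule above would match:

    import json
    import boto3

    events = boto3.client("events")

    # Publish a sample ATM transaction to the default event bus.
    # The source, detail-type, and detail fields are illustrative.
    events.put_events(
        Entries=[{
            "Source": "custom.myATMapp",
            "DetailType": "transaction",
            "Detail": json.dumps({
                "action": "withdrawal",
                "amount": 300,
                "location": "NY-NYC-001",
                "result": "approved",
            }),
        }]
    )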

Sample ATM application architecture



Wednesday, April 28

Chef - The Expert Cook of DevOps

A DevOps engineer spends much of their time deploying new services and applications, installing and updating packages, and getting servers ready for deployment. Doing this by hand is tedious and requires a lot of human effort. By using configuration management tools like Chef or Puppet, you can deploy, repair, and update the entire application infrastructure through automation.

Chef is an automation tool that can deploy, repair, update, and manage servers and applications in any environment.

What is Chef?

Chef is a configuration management tool that manages infrastructure by writing code rather than following a manual process, so that infrastructure can be automated, tested, and deployed easily. Chef has a client-server architecture and supports multiple platforms such as Windows, Ubuntu, and Solaris. It can also be integrated with cloud platforms such as AWS, Google Cloud, and OpenStack.

Understanding Configuration Management


Let us take the example of a system engineer who wants to deploy or update software or an operating system on hundreds of systems in the organization in one day. This can be done manually, but manual work invites errors: some software may crash while updating, and there is no easy way to revert to the previous version. Configuration management tools exist to solve exactly these problems.

Configuration management keeps track of all the software- and hardware-related information of an organization, and it repairs, deploys, and updates entire applications through automated procedures. In effect, it does the work of the many system administrators and developers who would otherwise manage hundreds of servers and applications by hand. Some popular configuration management tools are Chef, Puppet, Ansible, CFEngine, and SaltStack.
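None of these tools is implemented the way this toy is, but the core idea they all share, declare a desired state and act only on the difference, can be sketched in a few lines of Python; the package list and the apt-get command are placeholders:

    import shutil
    import subprocess

    def ensure_installed(package):
        """Converge toward the desired state: install only if missing."""
        # Checking for a binary on PATH is a simplification of a real
        # tool's state check.
        if shutil.which(package):
            print(f"{package}: already installed, nothing to do")
            return
        subprocess.run(["apt-get", "install", "-y", package], check=True)

    for pkg in ["nginx", "git", "curl"]:
        ensure_installed(pkg)

Running this script twice leaves the system unchanged the second time; that idempotence is the property configuration management tools are built around.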

Why I prefer Chef

Let us take a scenario: suppose we want our system administrator to install, update, and deploy software on hundreds of systems overnight. Done manually, this task invites human error, and some of the software may end up misconfigured or not functioning properly. This is where Chef comes in: a powerful automation tool that turns infrastructure into code.


Chef automates application configuration, deployment, and management throughout the network, whether we operate on-premises, in the cloud, or in a hybrid environment. We can use Chef to speed up application deployment. Chef is a tool for accelerating software delivery; here, speed means how quickly the software can change in response to new requirements or conditions.

Benefits of Chef

Accelerating software delivery

By automating infrastructure provisioning, everything the software needs, such as testing and creating new environments for deployment, becomes faster.

Increased service resiliency

An automated infrastructure can monitor for bugs and errors before they cause outages, and it can recover from failures more quickly.

Risk management

Automation tools like Chef or Puppet lower risk and improve compliance at every stage of deployment, and they reduce conflicts between development and production environments.

Cloud adoption

Chef adapts easily to cloud environments; servers and infrastructure can be configured, installed, and managed automatically by Chef.

Managing data centers and cloud environments

Chef runs on many platforms, and with Chef you can manage all your cloud and on-premises environments, including servers.

Streamlined IT operations and workflow

Chef provides a pipeline for continuous deployment, starting from build and test and running all the way through delivery, monitoring, and troubleshooting.


In summary, Chef helps IT teams adopt modern best practices, including:

  • Test-Driven Development: Configuration changes are tested in parallel with application changes.
  • AIOps Support: IT operations can confidently scale with data consolidation and third-party integrations.
  • Self-Service: Agile delivery teams can provision and deploy infrastructure on demand.

