
Enterprise of the Future: Disruptive Technology = Infinite Possibilities

Everyone is eager to explore GenAI as it reshapes roles and increases efficiency by augmenting people’s capabilities.

Artificial Intelligence (AI) has been around for many years and is no stranger to the technology world. Yet, the introduction of generative AI (GenAI) in 2022 has sparked new interest in the transformative potential of AI and its application across sectors. 

Clearly, GenAI is challenging the way we traditionally operate, spearheading an acceleration of innovation underpinned by growing investment and interest in AI. Everyone is eager to explore GenAI as it reshapes roles and increases efficiency by augmenting people’s capabilities.

Generative AI as capability enabler and augmenter

GenAI is steadily gaining traction where there is scope to augment people’s capabilities and potential, whether they are call-centre agents, frontline technicians, back-office employees, or software developers. Common use cases include summarisation, sentiment analysis, retrieval-augmented generation (RAG)-based knowledge searches, and automated email responses.
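
To make the RAG pattern concrete, the sketch below shows a minimal policy-lookup assistant in Python: it embeds a handful of internal documents, retrieves the passages closest to an employee’s question, and asks an LLM to answer using only that context. The OpenAI client, model names, and sample policy text are illustrative placeholders; any embedding model, vector store, or LLM could be substituted.

import numpy as np
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# A tiny in-memory knowledge base; a real deployment would use a vector store.
documents = [
    "Employees accrue 1.5 days of paid leave per month of service.",
    "Unused leave of up to 30 days can be carried over to the next year.",
    "Hardware incidents should be raised through the IT service portal.",
]

def embed(texts):
    # Return one embedding vector per input text.
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in resp.data])

doc_vectors = embed(documents)

def answer(question, top_k=2):
    # Retrieve the top_k most relevant passages, then generate a grounded answer.
    q_vec = embed([question])[0]
    scores = doc_vectors @ q_vec / (
        np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q_vec)
    )
    context = "\n".join(documents[i] for i in scores.argsort()[::-1][:top_k])
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Answer using only the provided context:\n" + context},
            {"role": "user", "content": question},
        ],
    )
    return completion.choices[0].message.content

print(answer("How much paid leave do I earn each month?"))

The retrieve-then-generate flow stays the same at scale; only the in-memory list gives way to a proper vector database and document pipeline.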

An extension of these use cases is copilots or workbenches that enhance the way people traditionally work. For instance, business analysts, particularly in domains such as finance, must browse various internal and external data and information sources, search through structured and unstructured data in multiple formats, perform analysis, derive meaningful insights, summarise the findings, and prepare a report or presentation for business leaders.

These activities are now being supported by GenAI-powered workbenches. These workbenches leverage GenAI agents to simplify the process of searching across repositories, pulling data from multiple sources with diverse structures and formats. They provide analysts with user-friendly interfaces and facilitate efficient data access and navigation via natural language searches, allowing them to focus on essential aspects of validation and analysis. 

We see a similar transformation in software development, where copilots are helping speed up the development process with code suggestions and recommendations. While these use cases increase productivity and bring efficiency gains, there is also huge potential to improve the employee experience across an organisation.

This is where large language models (LLMs) can play a vital role.

Current outlook: The need to simplify enterprise application interfaces

In the current state, employees are required to navigate multiple enterprise application interfaces and browse many systems to get the information they need for their day-to-day activities. It could be something as simple as seeking clarification on leave policies, checking their leave balance, or raising an incident to get a laptop or printer issue fixed.

To accomplish their daily responsibilities, employees often end up searching for standard operating procedures and other relevant data and knowledge. In general, they must work across multiple enterprise application interfaces, which creates a steep learning curve: they need to remember where specific data and information reside, understand the functionality of multiple enterprise applications, and master the way each one operates.

This fragmentation of information across different data sources and the labyrinth of applications that need to be navigated to perform daily activities leads to inefficiencies and confusion that ultimately impact productivity. Moreover, changes within an organisation necessitate training on new systems and new ways of working, requiring employees to learn and adapt continuously. 

The promise and the potential of LLMs

Large language models (LLMs) hold the potential to revolutionise current enterprise systems by enabling conversational interfaces for employees, along with software agents that can dynamically decide the next course of action based on what the employee is looking for. LLMs enable the development of applications that can understand and process natural language inputs, allowing users to interact more intuitively.

These LLM-based applications will be able to understand context and intent and direct the relevant application flow, freeing users from having to understand and remember complex interfaces and menu systems. This shift will enhance productivity by reducing the time and effort previously required to navigate complex systems. It will also make organisational changes easier to introduce without impacting end users.

One of the most compelling aspects of integrating LLMs into enterprise systems is their ability to shield employees from the underlying complexities of multiple systems while enabling them to complete the task at hand quickly and efficiently. The LLMs dynamically interpret requests and perform the relevant tasks based on the context of the interaction.
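
As a rough illustration of this idea, assuming the OpenAI chat API, the Python sketch below uses a single LLM call to infer an employee’s intent and then routes the request to the relevant back-end function, so the employee never needs to know which application handles what. The get_leave_balance and raise_it_ticket helpers are hypothetical stubs standing in for real HR and IT service-management APIs.

from openai import OpenAI

client = OpenAI()

def get_leave_balance(employee_id):
    # Stub for a real HR system call.
    return f"{employee_id} has 12 days of paid leave remaining."

def raise_it_ticket(employee_id, issue):
    # Stub for a real IT service-management call.
    return f"Ticket INC-1042 raised for {employee_id}: {issue}"

def classify_intent(message):
    # Ask the LLM to map a free-text request onto one of the known intents.
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Classify the request as exactly one of: "
                        "leave_balance, it_incident, other. Reply with the label only."},
            {"role": "user", "content": message},
        ],
    )
    return resp.choices[0].message.content.strip().lower()

def handle(employee_id, message):
    # Route the request; the employee only ever sees the conversational interface.
    intent = classify_intent(message)
    if intent == "leave_balance":
        return get_leave_balance(employee_id)
    if intent == "it_incident":
        return raise_it_ticket(employee_id, message)
    return "I can help with leave queries and IT incidents. Could you rephrase?"

print(handle("E1234", "My laptop screen keeps flickering, can someone fix it?"))

Swapping the HR portal or ticketing tool behind these stubs would not change the conversation the employee has, which is precisely how the complexity stays hidden.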

Future outlook: Microservices integration with multi-agent LLMs

The future state will take microservice-based architectures to the next level: LLM agents mapped to specialised tools will perform specific atomic tasks and be orchestrated on the fly. As users converse with the application, an orchestrator agent understands the context and intent and hands over control to other agents, which in turn converse with the user and collaborate with one another to perform the required functions and finish the task at hand.
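
A minimal sketch of this orchestration pattern, again assuming the OpenAI chat API, is shown below: an orchestrator picks the most suitable specialist agent for a request and delegates to it, with each specialist carrying its own instructions. The agent names, prompts, and fallback choice are illustrative, and in practice each specialist would also be wired to its own microservices and tools.

from dataclasses import dataclass
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o-mini"

@dataclass
class Agent:
    name: str
    instructions: str

    def run(self, request):
        # Each specialist answers with its own system prompt; a real agent would
        # also call the microservices and tools it is mapped to.
        resp = client.chat.completions.create(
            model=MODEL,
            messages=[
                {"role": "system", "content": self.instructions},
                {"role": "user", "content": request},
            ],
        )
        return resp.choices[0].message.content

specialists = {
    "hr_agent": Agent("hr_agent", "You answer HR policy and leave questions."),
    "it_agent": Agent("it_agent", "You troubleshoot laptop, printer and access issues."),
    "finance_agent": Agent("finance_agent", "You summarise financial data and reports."),
}

def orchestrate(request):
    # The orchestrator decides which specialist should own the request, then hands over.
    routing = client.chat.completions.create(
        model=MODEL,
        messages=[
            {"role": "system",
             "content": "Pick the best agent for the request. Reply with one of: "
                        + ", ".join(specialists)},
            {"role": "user", "content": request},
        ],
    )
    choice = routing.choices[0].message.content.strip()
    agent = specialists.get(choice, specialists["hr_agent"])  # simple fallback
    return f"[{agent.name}] {agent.run(request)}"

print(orchestrate("Can I carry over unused leave into next year?"))

Purpose-built agent frameworks layer conversation memory, tool calling, and handoff protocols on top of the same basic loop, but the division of labour between orchestrator and specialists is the essence of the pattern.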

In the future, LLMs will play a crucial role in making our workplace more intuitive, flexible, and efficient. The journey toward a user-friendly intelligent enterprise has just begun, where the possibilities are as extensive as the large language models themselves. 

As we continue to explore and integrate GenAI into our work, it becomes imperative to leverage this disruptive technology in a controlled and regulated manner. What lies ahead is exciting as enterprises now stand on the brink of significant disruption in working practices. 

The views in this article are those of the author and do not necessarily reflect the views of the global EY organisation or its member firms.

Contributed as part of AIM Branded Content.

This article is contributed by

Shakuntala Gupta

Technology Consulting – AI & Data leader, EY Global Delivery Services India LLP