Bite-sized Analysis covering the Business Impact of Technology
Within the cloud ecosystem, SaaS is the biggest revenue generator, followed by IaaS and PaaS. There is a strong push from software publishers to sell via SaaS rather than the perpetual licensing model, and the model has worked well for both vendors and users. For vendors, it has expanded their reach across large, mid-sized and small companies. It also gives them a steady revenue stream from a large base of customers, which funds product updates that are then passed on to those customers.
The benefits to enterprises are also well documented. After all, SaaS has been a runaway success because it has been a win-win for both sets of stakeholders.
But like any piece of technology, there are challenges associated with managing SaaS applications. While most enterprise decisions are taken with a lot of due diligence, it is not always possible to foresee these challenges. In fact, in certain cases, even if an organization does foresee them, it is left with no choice but to go with a particular piece of software for lack of alternatives.
But to be fair, in spite of its dream run, SaaS is still a relatively new concept and many of the teething issues will hopefully subside over time.
In addition to the challenges mentioned, another issue that many large organizations face is shadow IT, a phenomenon where employees end up buying SaaS services without the knowledge of their IT team. This can lead to inefficient processes and difficulty in managing software assets. Solutions like SAM (Software Asset Management) can help keep tabs on such scenarios, helping organizations manage their software footprint regardless of where it resides, whether on-premises or in the cloud.
The retail sector has transformed over the last decade. An industry traditionally dominated by the unorganized sector is going through one of its most exciting phases, and a big part of this tectonic shift can be attributed to e-commerce, where technology is the backbone.
While the unorganized sector might still have the larger share, the organized sector has made significant strides by leveraging technology across several business touch points.
And when we speak about the organized sector, one generally tends to think only about e-commerce. Yet even brick-and-mortar setups have adopted technology in large measure.
Just as hybrid has become the preferred route in the world of technology, a similar analogy can be drawn in the retail sector. With the advent of e-commerce platforms, many feared this would mark the end of brick-and-mortar setups. However, recent studies offer interesting inferences. One data point suggests that many people search online but eventually buy from retail outlets. This indicates that retail's future lies in "click and mortar", a hybrid approach wherein a business has both offline and online operations.
Analytics has been an essential tool in the transformation of this sector. It would not be unfair to say that the best-known use cases of analytics have been in the retail industry.
Technology uptake is rarely the result of a push mechanism; it is tangible business benefit that yields success. For instance, there is a reason why both offline and online retail platforms exist. While offline provides the customer with a complete shopping experience (supported by the perception of getting authentic products), online platforms are a rage because of convenience, range and price.
Both formats need a tight grip on parameters like supply chain, inventory control and trend prediction, among other things, and analytics has helped immensely in these areas. From referencing historical data to forecasting demand, analytics has made its usefulness very evident for this sector. And with the industry poised to adopt new and emerging technologies like VR and Web 3.0, the possibilities are infinite.
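To make the forecasting point concrete, here is a minimal, hypothetical Python sketch of a moving-average demand forecast. Real retail analytics stacks rely on far richer models and data; the sales figures and window size below are placeholders, so treat this purely as an illustration of the idea.

```python
# Illustrative sketch: a naive moving-average demand forecast on hypothetical weekly sales data.
from statistics import mean

weekly_sales = [120, 135, 128, 150, 160, 155, 170, 165]  # hypothetical units sold per week

def moving_average_forecast(history, window=4):
    """Forecast next period's demand as the mean of the last `window` periods."""
    if len(history) < window:
        raise ValueError("not enough history for the chosen window")
    return mean(history[-window:])

if __name__ == "__main__":
    forecast = moving_average_forecast(weekly_sales)
    print(f"Forecast demand for next week: {forecast:.1f} units")
```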
BI projects, if successfully implemented, can be game changers, but it is not easy to get them right on the first go. Moreover, when it comes to SMBs, every decision matters, as they can ill afford any wastage.
BI projects are perceived to be the preserve of large organizations, which have the wherewithal and the budgets to undertake them. However, a BI project need not be expensive, and even when it is, if properly executed it usually comes out on top as far as ROI goes.
Almost all organizations these days start capturing data in some form from the day of their inception, knowing that a critical piece of their success is synthesizing this information into a valuable strategic asset.
A well-articulated BI strategy can alter the growth trajectory of a start-up by enabling evidence-based decision-making, reducing expenses and improving supply chain management, among other things. Since things are nascent, getting them right is easier than in an established organization.
While it is great to capture data and eventually churn out meaningful insights, doing so typically requires a dedicated workforce and infrastructure. And this is where it gets a little complicated for SMBs, as they may not be in a position to invest in either. A natural argument would be to outsource this entire function, but it is not as easy as it is made to sound.
And with data, there is always a concern around security. Therefore, it is essential to strike a balance between data agility and security & governance.
Some ways to manage issues around BI projects are:
5G is seen as the enabler of the next phase of transformation for many sectors. Healthcare is no exception. While the concept of mobile health (m-health) has been there for a while, what 5G does is bring many potential use cases in this sector to fruition.
When one thinks of 5G, three things come to mind: reliability of connection, security and ultra-low latency. And these are precisely why 5G can bring to this sector a revolution that was earlier not even possible.
As India gets ready for its first rollouts, it would be interesting to watch the use cases championed for this sector. The initial foray would be in Tier 1 towns, with connected ambulances emerging in initial trials.
However, the real value of 5G will be in its ability to bridge the enormous gap that currently exists between urban metros and the rural sector. Quality healthcare is a big problem in the hinterland; India has fewer than two hospital beds per 1,000 people.
But building the infrastructure for such services comes at a cost, and it will be interesting to see how this is managed. The government obviously has a significant role to play; the only concern is whether the operators will be willing to invest in the required infrastructure.
Every country will have its unique set of priorities and challenges, and ours is no different. While there are some great healthcare providers in the country, access to those services is confined to a few and comes at a steep price.
The possibilities of 5G are limitless in healthcare, but it needs to be backed with intent.
MPLS (Multiprotocol Label Switching) has been the network of choice for over two decades now for most enterprises. Reduced network congestion, increased uptime, and security are some of the reasons why MPLS became a quick favourite.
But the MPLS market has reached a point of stagnation as far as revenue growth goes and will post a negligible CAGR over the next five years. The last two years have been particularly bad, as companies reduced their existing MPLS links while there were hardly any new deployments. In fact, due to the lockdown, many branches were also shut down. So, while work from home or hybrid work might have helped some technology markets, the same cannot be said for the MPLS business.
As a result, the overall enterprise data services business will have a relatively tepid growth rate, as MPLS has had the largest share of this market (close to one-third of the overall enterprise data services market). Going by this trend, it would not be surprising if MPLS were upstaged by ILL (internet leased line) connectivity.
Also, the MPLS market has been battling price erosion for some time now, affecting the profitability of the business line and eventually impacting the growth rates. As MPLS is the preferred connectivity option for most large enterprises (SMBs tend to prefer either leased line connectivity or some also opt for broadband), the scope for new deployments is also limited.
For a telecom operator, MPLS is too big a market to neglect and will remain a mainstay for the next couple of years, but an increased focus on ILL and point-to-point connectivity (DLC or Ethernet-based) can be expected.
Challenges with RPA Implementation
Whenever any tech product or solution is launched, there are bound to be some challenges in market acceptance and implementation. These are teething issues which happen across all sectors. The quicker they get resolved, the better the prospects for growth.
RPA also has its fair share of challenges. The most significant is that many perceive it as a replacement for the human workforce, which results in employee resistance.
From the era of the industrial revolution to the modern world of the metaverse, there has always been the apprehension of how machines will replace human beings and rob them of their livelihood.
So rule number one for a successful RPA deployment is to understand the people who work in an enterprise. It is vital to reassure them that the time saved on repetitive tasks can actually be used for creative and strategic work.
Also, IT and the functional departments must drive such initiatives equally. For that, it is important to choose a suitable business case for an early success, which paves the way for enterprise-wide adoption. Fostering a culture of learning new skills to take advantage of new technology should not be the purview of IT alone but of all the stakeholders involved.
Security and privacy are horizontals that are always critical to get right, no matter which technology deployment is in question. Since RPA usually deals with process automation tied to business-critical data, all the necessary compliance certifications, such as PCI DSS or GDPR, should be in place as per your industry requirements.
Clear communication about the processes involved, i.e. educating the end beneficiary, evaluating the solutions in the market and facilitating the final deployment, helps smooth over the challenges that typically arise in such transformative exercises.
If data is the new oil, then content is the finished product from the refinery. In most knowledge-based industries, managing content can be a deal breaker. However, ECM solutions don't come cheap, so due diligence is required before any purchase decision is taken.
One common theme which cuts across any tech conversation is digital transformation. While DT can mean different things to different organizations, the objective is to streamline, automate and increase business productivity, among other things.
One of the earliest forms of digital transformation that many organizations carried out was to either go paperless or enhance process visibility by digitizing many paper-driven processes. And ECM, to a large extent, provides these solutions. The goal of an ECM solution is to manage the entire life cycle of an enterprise’s content, which could include images, structured or unstructured data, to reduce the risk of data loss and thereby improve productivity.
But the cost of a typical ECM solution could be a deal breaker. Typically, in the perpetual software licensing model there are many cost elements: licence fees, implementation fees, hosting, maintenance and training, to name the broad ones.
Compare that with the SaaS-based model, where the cost elements you need to tackle are licensing fees (opex-based), implementation and training. The fundamental reason is that with legacy applications you must manage both the front end and the backend infrastructure requirements (hence the term monolithic). This may seem like an oversimplified comparison in favour of the cloud, and ideally any enterprise-level decision should be based on a proper TCO analysis.
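To illustrate what such a TCO comparison might look like, here is a rough, back-of-the-envelope Python sketch. Every figure and cost head below is a hypothetical placeholder, not a benchmark; an actual decision should rest on a proper study with your own numbers.

```python
# Back-of-the-envelope TCO sketch: perpetual-licence ECM deployment vs SaaS subscription.
# All figures are hypothetical placeholders for illustration only.

def perpetual_tco(years, licence, implementation, hosting_per_year, maintenance_rate, training):
    # Maintenance is commonly quoted as a percentage of the licence fee per year.
    return (licence + implementation + training
            + years * (hosting_per_year + maintenance_rate * licence))

def saas_tco(years, subscription_per_year, implementation, training):
    # Hosting, maintenance and upgrades are bundled into the subscription (opex).
    return implementation + training + years * subscription_per_year

if __name__ == "__main__":
    horizon = 5  # years
    print("Perpetual:", perpetual_tco(horizon, licence=100_000, implementation=40_000,
                                      hosting_per_year=15_000, maintenance_rate=0.2,
                                      training=10_000))
    print("SaaS:     ", saas_tco(horizon, subscription_per_year=35_000,
                                 implementation=15_000, training=5_000))
```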
The enterprise connectivity market has witnessed changing patterns when it comes to different kinds of connectivity services.
There was a time when this market was heavily dependent on MPLS services. While MPLS remains the biggest contributor in terms of revenue, its rate of growth has reduced considerably, primarily because of market saturation and price cuts. On the other hand, point-to-point connectivity services like DLC or Ethernet services have suddenly picked up the pace on account of OTTs. This segment has seen the most considerable positive uptake in the last few years.
The Internet Leased Line (ILL) market has been steadily growing over the past few years. Going by the trends we have seen, in a year or two the ILL market could become the most significant revenue contributor in the enterprise connectivity services business, toppling MPLS services.
Cloud adoption and application proliferation have aided this steady growth of ILL services, as the requirement for reliable internet connectivity is constantly increasing. The need for dedicated internet is highest among OTTs, banks, fintechs, e-commerce and IT/ITeS companies.
In fact, in some cases, it has been observed that small to mid-market organizations prefer ILL services for branch connectivity to drive their digital transformation initiatives. This was earlier done primarily over MPLS.
The growth story is anticipated to continue across large enterprises and SMBs; however, how they use these services might differ. Large enterprises would look at ILL for branch connectivity as well as a backup link. On top of this, OTTs have spurred the growth of this sector, as their transaction volumes are enormous.
In the SMB market, many consider ILL as their primary connectivity. And given the size of this segment, it augurs well for the market. With efforts around cloud and digital transformation on the rise, this market is expected to grow over the next 3-5 years.
Cloud adoption in India has come a long way. From trepidation around security to initial adoption confined to non-critical applications, we have covered a lot of ground in the past decade.
When we speak about cloud, one must bear in mind what kind of application is being hosted, as there are typically three kinds: those that run the business, those that manage certain aspects of the business, and those that support various non-critical functions.
Regarding the first kind, security, latency, and predictability become essential. So, while choosing the right service provider is necessary, the network fabric that delivers these infrastructure services is equally important.
The most common way of connecting to any cloud service is through an internet connection running on a secure network. But when it comes to mission-critical applications, the connection's security, performance and reliability also need to be looked into. This is where cloud interconnect comes in: a physical connection between an enterprise's core network and the cloud provider's network, typically established at a colocation facility. This is the surest way to get secure and reliable connectivity for your most critical workloads.
Since most enterprises these days have a Hybrid IT model, exposure to cyber risks is high. Point-to-point connectivity is a more secure approach.
Also, in many industries, latency matters. Thanks to a direct pathway, the latency is reduced in a cloud interconnect.
The present and future of business are based on a hybrid IT environment, but as we have seen in various deployments, it is fraught with complexities. But direct connectivity might be a way enterprises can explore as the systems mature and the cloud becomes more integrated with our technology stack.
In a world where microservices and containers occupy high mind share and are seen as the future of how applications should be built, legacy applications have acquired a negative perception. But the reality is that these legacy applications support most business-critical functions, and over the years deep investments have gone into building, supporting and growing them. In many scenarios, they are hard-wired into the very functioning of an organization (take, for example, core banking solutions).
While the merits of microservices architecture need no retelling, it is also essential to understand that not all legacy applications can be discarded. Application modernization is a middle path that tries to bridge the gap legacy has created by adopting more modern tools and infrastructure.
Typically, there are three ways of going about this process. The first, called "lift and shift", involves, as the name suggests, few changes to the code: the application is lifted "as is" and put onto new infrastructure. The second, called "refactoring", involves significant redevelopment of the code, to the extent of completely rewriting it. Finally, there is "replatforming", somewhat of a middle path between the first two: you don't rewrite the entire code, but you ensure the application can take advantage of a modern cloud platform.
But the most crucial aspect remains identifying which application needs to be modernized. An application modernization exercise usually involves significant cost and effort, and often an application is so strongly intertwined with the system infrastructure that decoupling it does not justify the ROI. Hence, choosing the right candidate matters!
We have had several cloud outages in the last year, and in most cases the issue was traced to the network. Enterprise technology has taken giant strides, but managing the corresponding growth in complexity has not been easy.
Take, for instance, the Data Center. There has been a massive transformation that has taken place over the past decade. From having data centers across multiple locations to consolidation to virtualizing infrastructure to adopting the cloud, we have come a long way. And somewhere in this process, virtualization has played a pivotal role. So much so that one could call it the chassis on which this entire movement took place.
While server virtualization has been the front runner in adoption, network virtualization has also found a favourable response among enterprises. If you follow the pattern, the fundamental switch has been towards using software to manage your physical infrastructure, be it servers, storage or network, the foundation of any data center fabric. As far as SDN goes, the reasons for adoption range from cost optimization and fast failover to network automation; essentially, most enterprises are trying to tackle the complexity of managing networks burdened by data-intensive and bandwidth-hungry applications.
As a solution, SDN finds its application within the data center (over LAN) and has also found success in connecting geographically distributed environments through SD-WAN.
With everything being software-driven, SDN will witness higher adoption in time. And with multi-cloud emerging as an industry standard in the coming years, SDN will be a natural choice.
As employees have evolved and learned to deal with new ways of work, enterprises are being compelled to implement new IT strategies to suit the work culture requirements of their employees. However, matching employee expectations sometimes becomes a challenging task for organizations as there is a constant risk of possible security breaches or possibilities of compromise on the company's sensitive information.
Striking the right balance between what employees want and the level of flexibility that organizations can provide without compromising the company's data and resources has become a key focus for most IT leaders.
When the pandemic hit, organizations did what they could do best to keep business running. IT teams worked to support their employees in every possible way they could. IT decision-makers had to make certain swift decisions to accommodate the remote working of employees. However, in the present scenario, as the trend of work-from-anywhere is catching up and hybrid work is becoming a reality, businesses need to plan for the long term to create a sustainable modern workplace.
Organizations are revisiting their IT policies and implementing new strategies to create and support a sustained workplace transformation.
Businesses on the journey of digitization should ensure that workplace modernization becomes a core component of this ongoing quest to achieve business excellence. Having a clear workplace modernization roadmap helps organizations minimize the business disruption risks in the future. This can be achieved by assessing the current IT infrastructure, employees' changing needs, and how businesses can cater to these needs without compromising on business interests.
As cloud adoption increases in India, there has been a corresponding increase in the complexity of the projects undertaken and the maturity of the services requested. More often than not, these requirements stem from cloud natives or enterprises with large IT teams who are equipped to manage such turn-key projects.
We have come a long way, from teething issues during the first deployment to managing multiple cloud environments. And this growth trajectory has been matched by cloud service providers. Beyond the standard compute, storage and network offerings, the library of service offerings is astounding. The discerning customer who has been keeping close track will also agree that each of the big players (AWS, Microsoft Azure, Google etc.) has its sweet spots. While there are multiple reasons organizations prefer a multi-cloud environment, the ones that come to mind are access to an array of feature-rich services and avoiding potential vendor lock-in.
Agreed, multi-cloud promises a multi-vendor federation but is not easy to manage, or so it seems. Cloud, in its infancy, promised that the future of IT would be agile and swift and that vision should not be compromised as the market matures and evolves.
One has to get multi-cloud management right to justify all the investments in the cloud and realize the promise it holds. What is needed is a central console that gives you control over your disparate cloud environments.
An ideal framework for a multi-cloud management platform should be able to:
A successful multi-cloud framework should have a catalogue of services which are easy to provision by clearly articulating your cloud architecture.
The telecom sector is going through an exciting phase. With the upcoming 5G auction (26th July), a lot is at stake for the industry. So far, the fortunes of the sector have been decided by the consumer business, as it generates upwards of 80% of overall revenues. But that trend is changing, with the B2B or enterprise business picking up steam.
Think about it: when we speak or hear about 5G, how many times do we discuss consumer use cases? Compare that with the B2B sector. The opportunity clearly lies in this segment. And that might be the big difference going forward, with established players having a much stronger footing among enterprises compared to newer entrants.
From a service breakout perspective, mobile services are the most significant revenue contributor (upwards of 60%), but the growth levers in this segment are SMS services and M2M connectivity.
The future of enterprise mobile services will pivot around IoT, and here is why. Most of the 5G use cases we come across are around IoT. The growth of SMS also stems from the significant push behind M2M connectivity and A2P (application-to-person) messaging.
The timing is right for the telecom sector as we step into the future. Advancements in AI, 5G and IoT, in conjunction with their brand and high customer trust, can help telecom companies emerge as Large System Integrators (LSI). They are equipped to provide consulting, advisory, and systems integration (SI) services. Telcos will have a significant coordination role in the value chain. Pivoting the legacy business will require capitalizing on dedicated core networks, developing the right set of ecosystem partners, and fostering IoT agility to build customer trust.
The V-SAT connectivity market is one of the smallest (about 4-5%) among enterprise connectivity services. The V-SAT business took a beating in the past two years; in fact, the market reduced in size.
BFSI, government and media contribute more than three-fourths of V-SAT requirements in India, with BFSI leading the pack at 40%. In BFSI, the primary requirement comes from ATM installations, whereas in the media & entertainment sector, satellite cinema distribution is a key use case.
As far as the BFSI sector goes, a small percentage of loss could be attributed to 4G replacing VSAT broadband at various ATM sites.
Similarly, the shutdown of movie theatres also led to the degrowth of this market.
Hopefully, with the worst behind us, the V-SAT market will return to its growth trajectory, albeit a tepid one.
However, over the last year, there has been a lot of movement from a provider's point of view. Apart from existing players like Airtel, Reliance Jio and Tata Nelco, global players like Amazon (Project Kuiper) and Elon Musk's SpaceX also see potential in this market.
Apart from the three industries that drive V-SAT uptake in India (BFSI, government, media & entertainment), a prominent use case for satellite broadband would be to provide connectivity to remote locations. But a lot rides on how operators price these services. Currently, there is a vast pricing differential between connectivity through V-SAT and 4G or fixed-line broadband. While V-SAT-based connectivity is priced at more than Rs 1,000 per GB, Indian 4G and fixed-line broadband are among the cheapest in the world (ranging from roughly Rs 2 to Rs 15 per GB).
Satellite internet will need the government's support in simplifying the clearance process, creating a conducive ecosystem, and eventually tying it in with the larger aegis of the "Digital India" programme.
Any conversations around technology themes like Workspace Modernization, Digital Transformation or Cloudification are not complete unless the security aspects are discussed.
Not a single day goes by when we don't hear about some security breach. Some are planned, while a few are accidental, but the loss incurred either way is significant.
The past two years have been significant in digital consumption worldwide, and India is no exception. There are hardly any companies which do not have a digital transformation mandate. Cloud adoption and penetration have hit an all-time high and will only grow in the next few years.
Over the last few years, a lot of emphasis was placed on securing the perimeter, as most of us used to work and access applications from within the corporate network. But with remote work on the rise, endpoint protection is again front of mind for most tech decision-makers.
As the volume and severity of security breaches have grown, there has also been a strategic shift in how solution coverage is administered.
Until a few years ago, endpoint security meant anti-virus, a stand-alone product designed to protect a single endpoint (laptop, mobile, desktop, IoT device etc.). With the advent of endpoint protection platforms, the endpoint is no longer seen in isolation but as an integral part of the enterprise network, with visibility into all connected endpoints from a single location.
Further, features like automatic updates, behavioural analysis etc., make today’s platforms more holistic.
The endpoint is still the most vulnerable, with more than three out of four successful breaches originating at the endpoint. And with remote work becoming a standard practice, more and more employees are connecting to their corporate networks from devices outside their office, forcing many CISOs to rethink their security posture. With zero-day attacks becoming more common, managing endpoints is becoming more critical.
Two topics doing the rounds in the enterprise tech space are data center modernization and the federation of multi and hybrid cloud. And one technology which helps stitch these pieces together is Hyper-Converged Infrastructure (HCI).
Some big names hog the limelight, but it is important to first understand your requirement and then map it to each solution's capability framework.
While tech specifications are a critical element in the success of a deployment, one also needs to factor in the complexity of the entire process. One of the biggest challenges in managing a legacy three-tier infrastructure is manageability and the coordination with multiple vendors needed to achieve the desired results. Hence, one of the foremost principles to keep in mind is that your HCI solution should be simple and easy to use; remember, that was one of the reasons you went for it in the first place. The purpose of HCI is to simplify the management of IT. HCI should not add to your challenges but help you resolve them.
Given today’s IT posture for most large enterprises who have to deal with a multi-cloud hybrid IT environment, one must try looking for solutions which provide a single pane of glass view of your compute, storage, network fabric and the hypervisor which manages all of it.
When one speaks about HCI, more often than not, it is said in reference to Hybrid Cloud because HCI gives you the closest “cloud-like” experience in an on-premise deployment.
Another benefit of a simple interface is that it reduces risk instances. That results in a significant leap in innovation and a quicker time to market.
And lastly, bear in mind that regardless of which infrastructure we speak about (bare metal, public or private cloud), the heart of the entire tech juggernaut is the application. So it is vital that whichever HCI solution you choose allows all enterprise applications to run, regardless of the scale of operations.
Cloud adoption in India has been steadily growing over the past few years. The pandemic worked as a catalyst and pushed the levers of growth. Public Cloud services found buyers across a broad section of companies, i.e. large, mid-market to SMBs.
When one thinks about SMBs, there is a notion that cloud is a natural choice given its tenets. Be it access to the latest technology, the frequency of updates or the benefits of an opex-based commercial model, the possibilities are limitless. While true in many parts, there are adoption challenges in this market segment too.
First, we need to understand the constituent mix of this segment. Without getting into verticals, one needs to understand that this segment consists of new establishments (comparatively fewer) and many organizations that have been in business for years.
For most new establishments, since they do not have any legacy, their choices will always be skewed towards cloud services. In today’s world, everyone wants to focus on their core business while the rest is best managed by specialists. More often than not, these companies are pretty tech-savvy. Ownership and decision-making are done by young entrepreneurs who see technology as an enabler and, in some instances, the key differentiator to their success.
That leaves us with the other lot: companies that have existed for a while, and they constitute a relatively large share of this segment. While they too have adopted cloud in parts, their challenges are greater than those of the new lot. It would be wrong to conclude that they are inept at understanding technology; the more significant issues are around managing legacy infrastructure, the difficulty of revamping things, and getting the necessary management support.
What needs to be understood here is that the cloud is a means to help you manage your business problems, not add to them. In many small organizations, the perception is that moving to the cloud translates to managing "one more thing".
So above all else, proper change management is required to gravitate from an on-premise world to a federated cloud environment.
The partner ecosystem has a pivotal role here as it primarily caters to the requirements of SMBs. In addition, there is a growing trend to focus on digital channels. As the market continues to mature, both will have a role in taking the SMB segment to a critical mass.
Process automation has long been a central theme when an organization takes up any digital transformation initiative. While the premise of RPA freeing its human counterparts from mundane, repetitive tasks is quite well known, the adoption of any new technology also depends on various other factors like cost, ease of deployment etc. Let us evaluate where RPA stands vis-a-vis some of these key parameters.
While India has one of the highest mobile services penetration globally, the same cannot be said about fixed broadband. India has one of the lowest penetrations (still in the single digits). A reason for that could be the stupendous success that mobile services have witnessed. In a way, fixed broadband did not find a customer base or a use case beyond urban metros. But then that is the consumer market. And as we all know, the dynamics of the enterprise business are altogether different.
So far, enterprise-grade broadband has not yet reached critical mass, which effectively translates into a huge opportunity for service providers. But then again, how we see fixed broadband being used will be segment-specific. The SMBs will be the ideal market for mass adoption, but that does not mean there is no opportunity among large enterprises.
Whether it is consumer-grade or business-grade connectivity, we have made stellar progress as far as connection speed is concerned. Unfortunately, the same cannot be said about the quality of the connection; we are yet to achieve the levels that most organizations expect.
While SMBs may not be so particular about the quality of the connectivity, for large enterprises this is a no-compromise parameter. But with things improving, we see bright possibilities. If these connections come with a firm commitment to quality, or what the industry refers to as SLAs, larger establishments could show a reasonably high uptake of broadband services. Initially, they might not replace their existing MPLS links but will consider broadband as a backup link or for branch connectivity.
Regarding verticals, the fight for the top slot is usually between BFSI and IT/ITes. These two industries always lead from the front and have a high technology propensity. But given the current quality issues with fixed broadband, the BFSI sector has been a little muted in terms of adoption. But if the quality of service keeps improving (as it is), this sector could have enormous potential.
The retail sector is the clear leader as far as usage is concerned, as large retail companies have a high number of branch outlets across different cities, which makes many of them opt for a broadband connection as the secondary connection, if not the primary one.
The Media & Entertainment (M&E) sector is one of those sectors which finds its relevance regardless of the times we live in. In fact, it is appropriately called the sunrise industry.
Like the world of fashion, this is a sector prone to constant change. Being a completely consumer-driven industry, staying relevant and having high recall value are of paramount importance.
The M&E sector is constantly evolving to address the demand for high-quality output while working under multiple constraints such as production environments, aggressive timelines and a distributed ecosystem.
In order to provide a world-class user experience, the M&E sector has long been a big spender on cutting-edge applications. And to deliver these class-leading applications, it is important that they are paired with high-performing client devices (laptops or desktops).
With a lot of focus on immersive experiences, the role of devices will also change. A lot of emphasis will be on creating an ecosystem which harbours the key tenets of communication between different appliances, devices and solutions. Be it movies (interactive media), OTT or gaming, the frontier of experience is quite significant. And all of it relies very heavily on technology. Yes, one can never ignore the creative elements in the entire scheme of things, but technology is helping realize the visions of many industry stalwarts.
In today's world, where apps have taken centre stage (in fact, not even apps; it is the experience that matters), a moment of thought has to go to the hardware infrastructure that supports all of these heavy and demanding applications. Whether it is game development, content production or broadcasting, high-end computing devices become an integral part of the development process. Especially these days, when the employee experience matters a lot, having a solid platform to create, innovate and transform ideas counts.
The enterprise fixed voice business has been a steadily decelerating market. The pandemic only compounded the challenge. The primary reason for this steady decline could be attributed to the alternatives in the market today and how workforce transformation is taking place in India. There is no strong use case that we can see in the foreseeable future, which will help this segment revive.
The only exception is SIP trunking (Session Initiation Protocol trunking) services. SIP takes advantage of the IP network and connects the on-premise phone system with the PSTN. This is a growing market, since it leverages the internet while making use of your legacy telecom infrastructure such as Ethernet fibre or MPLS. Moreover, through SIP trunking, one can achieve more than just voice communication.
Many cloud telephony providers use SIP trunking services to provide end-to-end communication services to their end customers. Taking a cue, most CSPs are also going down that route and positioning SIP trunking as a UC service instead of a stand-alone fixed voice service. A UC platform managed and operated by a CSP will have a strong foundation in QoS and security.
The UC route benefits the enterprise, too, as it does not need to invest in multiple networks to perform disparate things like voice, video or chat. Also, most organizations are familiar with cloud-based services; hence they expect elasticity and scalability when looking for the next set of technology procurements. SIP Trunking allows companies to scale as and when required, supporting concurrent calling seamlessly.
And finally, a big reason why SIP Trunking will continue to remain relevant in the near term is that a fixed-line connection gives a sense of credibility regardless of the industry to which an organization belongs. However, most startups are hesitant to invest in legacy-based fixed-line services. SIP trunking fulfils those requirements and gives an organization the local presence credibility.
RPA is still in its nascent stage as far as the Indian market goes. Considering the small base, it is not surprising to see CAGR in the mid-twenties. What will be interesting to see is how various industries adopt RPA and make compelling use cases.
Current adoption is led by BFSI, ITeS and, to some extent, healthcare. Onboarding new clients, loan and credit card processing, account reconciliation, and accounts payable and receivable are some initial areas where RPA has found its footing in the BFSI sector.
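As a toy illustration of one of these BFSI use cases, the sketch below mimics a simple account reconciliation step of the kind an unattended bot might perform. The data and matching rule are hypothetical and not tied to any specific RPA product.

```python
# Toy reconciliation routine: match ledger entries against bank-statement entries
# by reference ID and flag mismatches for human review. All data is hypothetical.

ledger = {"TXN001": 2500.00, "TXN002": 1200.50, "TXN003": 780.00}        # hypothetical
bank_statement = {"TXN001": 2500.00, "TXN002": 1250.50, "TXN004": 90.00}  # hypothetical

def reconcile(ledger, statement):
    """Return transactions that are missing on either side or whose amounts differ."""
    exceptions = []
    for ref, amount in ledger.items():
        if ref not in statement:
            exceptions.append((ref, "missing in bank statement"))
        elif abs(statement[ref] - amount) > 0.01:
            exceptions.append((ref, f"amount mismatch: {amount} vs {statement[ref]}"))
    for ref in statement:
        if ref not in ledger:
            exceptions.append((ref, "missing in ledger"))
    return exceptions

if __name__ == "__main__":
    for ref, issue in reconcile(ledger, bank_statement):
        print(ref, "->", issue)  # an unattended bot would route these to a review queue
```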
RPA can be further broken down into two segments, i.e. attended automation and unattended automation. When work is done in conjunction with people, that process is known as attended RPA, whereas if the software manages the entire end-to-end process, that process is known as unattended RPA.
Currently, the attended market has an edge with 60% of revenue generated from it, while the remaining 40% comes from unattended.
Likewise, the RPA market has one more break-up: RPA software and RPA services. Here, the services market leads software by a significant margin (70:30).
Automation is an integral part of many digital transformation initiatives. Cost containment, compliance, error management, and expediting service delivery are some key objectives achieved through RPA. These are business vectors strongly aligned to the broad concept of process automation in an enterprise.
Another reason for the widespread adoption of RPA is that implementation doesn't require a change in business process, nor is there a heavy reliance on IT teams; simple RPA projects don't need much IT help. In a world of self-service, where everything is being rolled out as a service, it will be interesting to see how RPA as a service gains acceptance in India and whether true customization is offered to solve some real-world problems and business challenges of enterprises.
The toll-free market has witnessed a steady decline over the last few years. The rate of deceleration further increased during the previous two years. While the market will continue to reduce, hopefully, it will not erode at the same pace as it did in the last two years.
Toll-free services have lost their importance for two significant reasons. First, with mobile calls effectively free, the core appeal of toll-free numbers has become redundant.
Secondly, and more importantly, the advent of cloud-based call management systems, which provide toll-free services along with other features like call tracking, routing, sticky agents, and CRM integration, makes them a much more compelling proposition for enterprises.
The toll-free market is already saturated, paving the way for cloud-based communication channels. Standard offerings like 1800 services are being replaced by cloud contact centers providing virtual phone systems to agents/users.
The only segment that will continue using these services is some of the large enterprises that have invested in them for years. However, there is no uptake as far as MoUs go.
Hyperconverged Infrastructure (HCI) is an integral component of the data center modernization process. But for a cost-conscious country like ours, no digital transformation initiative is given a green signal unless it is attached or driven by a business imperative (quite rightfully so).
While data center modernization is a drive that many organizations are undertaking, it is crucial to first identify the key business objectives it will fulfil and if there is a way to objectively create KRAs around that. As DC modernization is not a solution but an approach, it is vital to do an audit to understand where one stands in terms of IT infrastructure and accordingly map your requirements for DC modernization.
Since HCI, at its bare minimum, consists of compute, storage, network fabric and a hypervisor, it is vital to assess each of these individual parameters in terms of their current state and future readiness.
However, as with any major transformation initiative, gathering feedback and requirements from all concerned departments, even those that may have nothing to do with technology, is very important. This way, the IT team can assess some of the real-world challenges an enterprise faces and how modernizing its data center can mitigate them.
When it comes to actual deployment, considering that almost all enterprises prefer a hybrid IT stance, equal importance must be given to each component of that stance: on-premise, colocation, private cloud and public cloud all need to work in tandem in the most cost-effective manner (with security built in as a critical feature). Most hybrid IT projects fail due to a lack of planning around the migration strategy; a migration that is merely a lift and shift of data does not deliver the ideal benefits of software-defined infrastructure. A data center built for the future should first and foremost be flexible enough to accommodate changes required in times to come. Also, how you prioritize your workloads and transition this change is critical to the overall success of your modernization strategy.
Barring the occasional video call to connect with family and friends who stay abroad, most of us never used video conferencing to communicate daily. That changed with the lockdown. Suddenly, platforms like Zoom and MS Teams became the thread that connected us all. And the same applies to enterprise communication: VC became the de facto standard, whether for internal meetings or client/partner engagement.
Does it make you wonder why? These apps existed well before the lockdown, but very few companies used them to the extent they are being used now. We feel there are two primary reasons for that: one is transactional, the other more human.
Taking the first point, these VC solutions connect people and help them collaborate, and there is a difference between the two. Collaboration is a layer that sits on top of communication, where you get to communicate and get real work done, somewhat like you would in a physical setup. We think that was a big differentiator.
And about the human element, consider this. Most of us were confined to our homes with limited to no interactions with any human being. We needed that human connection to reaffirm that we had each other. And this transcended personal and professional boundaries.
The interesting bit is that hybrid work seems to be the new normal, with many of us going back to our offices, and video-based conferencing is the lifeline that facilitates it.
But what needs to be seen is how enterprises and OEMs can together take these platforms to the next level. In the post-COVID-19 phase of remote working, organizations wishing to drive sustainable competitive advantage will have to resolve challenges across both technology and people, and to do so they need to figure out how to derive that advantage from remote working itself. Till now, these apps have been used largely as business continuity tools enabling companies to mitigate the impact of COVID-19.
To stay relevant, this sector has to constantly evolve, go beyond how it is used today, and come out with innovative use cases. It will be an exciting phase and differentiate the big boys (in what they have to offer from a technology standpoint) from the also-rans.
Over the last decade, cloud computing has found mass adoption cutting across sectors and market segments. A trend that has been witnessed amongst startups and smaller organizations is choosing to host everything on the cloud, i.e. the industry refers to them as cloud natives or born in cloud companies.
However, the rest (which includes large and mid-sized organizations) prefer a more balanced view and choose the hybrid model. Considering most of these organizations have some legacy infrastructure or applications, hybrid cloud becomes the obvious choice.
However, since hybrid cloud is a combination of your on-premise Data Center with a Public Cloud environment (or multiple cloud environments), it becomes imperative that the transition from one environment to another is seamless. But that is only possible if your Data Center or the applications that run on it are as nimble and agile as their cloud counterparts.
The challenge is that most legacy infrastructure and applications were not built for a cloud-enabled world. They are usually monolithic, and hence there is an experience gap between the on-premise world and the cloud ecosphere.
This is where the concept of Data Center Modernization kicks in. The idea of modernizing data centers hinges on the core concepts of consolidation, maximizing utilization and cost optimization. All this is possible when one considers the convergence of data center operations. That’s the prima facie use case of Hyper-Converged Infrastructure (HCI).
In other words, HCI can be seen as an extension of how virtualization transformed the Data Center a decade back. In most cases, enterprises virtualized their compute and found tremendous value. HCI extends the benefit by rolling in compute, storage, network, and a hypervisor into a software-defined model that can run on any commodity hardware.
HCI has played, and will continue to play, a pivotal role in realizing the true potential and benefit of hybrid cloud.
Over the last few years, the use of SMS for personal communication has considerably reduced, with most users preferring OTT messaging platforms like WhatsApp and Telegram. In fact, the consumer SMS market is a decelerating market.
But this does not mean SMS is dead. On the contrary, enterprise SMS, A2P (application-to-person) in particular, has a stellar story and will grow at almost double-digit rates over the next few years.
All the notifications you receive, like OTPs and bank statements, are examples of A2P SMS. This market got a shot in the arm during the pandemic, with a surge in digital transactions, government alerts and other app notifications.
But the key driver for this sector going forward will be IoT. When we speak about IoT, there are various ways devices can communicate; to put things in context, IoT can be bifurcated into cellular and non-cellular IoT. For cellular-based IoT, notifications run over SMS, making it a huge opportunity, considering the M2M SIM market is growing reasonably rapidly (a CAGR in the mid-20s).
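As a rough illustration of how an application or IoT backend might trigger an A2P message, here is a hedged Python sketch that posts to a hypothetical SMS gateway REST endpoint. The URL, credentials and payload fields are invented for illustration only; consult your provider's actual API documentation.

```python
# Minimal sketch of pushing an A2P notification through a (hypothetical) SMS gateway REST API.
import json
import urllib.request

GATEWAY_URL = "https://sms-gateway.example.com/v1/messages"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"                                     # placeholder credential

def send_a2p_sms(msisdn: str, text: str) -> int:
    payload = json.dumps({"to": msisdn, "body": text, "type": "transactional"}).encode()
    request = urllib.request.Request(
        GATEWAY_URL,
        data=payload,
        headers={"Content-Type": "application/json", "Authorization": f"Bearer {API_KEY}"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status  # the gateway's HTTP status code

if __name__ == "__main__":
    # Example: a meter-reading alert sent by a cellular IoT device's backend
    status = send_a2p_sms("+911234567890", "Meter 42: reading 1034 kWh recorded at 10:05")
    print("Gateway responded with HTTP", status)
```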
Emerging verticals such as healthcare would also help propel the growth story of A2P SMS, with automation being the critical business lever.
Also, for a country like ours, where mobile penetration is very high and for a segment which still relies on feature phones, A2P SMS remains one of the most cost-effective and yet secure ways to reach out to a large cross-section which resides not only in urban metros but also in the hinterland.
An American marketing adage says, "if you treat your employees like gold, they will treat your customers as diamonds". This holds true for any sector but becomes more relevant where human capital is the primary lever of success.
ITeS has more than 3.5 million direct employees, making it one of the biggest industries for a skilled workforce. The last two years saw many "firsts" for this sector, as remote work was introduced as a concept to keep business running. For an industry which is highly regulated and process-driven, this change was not an easy one to manage.
Also, last year we saw a wave of mass resignations. Although a trend witnessed more in the IT sector, the ITeS sector has always struggled with attrition given the employee demographics. Along with compensation, today's employees want choices: freedom to work from anywhere, higher productivity through first call resolutions, and advanced technologies like AI and Analytics to expedite regular queries.
Given this shift, Cloud Contact Center, in parts, has answers to many of these requirements. Attributes like the ease of deployment, management, and scale make it ideal for organizations of today, especially those who don't have lots of legacy infrastructure. A reason why we see the Cloud Contact Center market growing at a healthy rate of 12-14% CAGR.
However, considering many established players have been in business for more than two decades, the concept of rip and replace might not find many takers. This is coupled with challenges around data sovereignty, concerns around the long-term cost of hosting, and issues related to migration and integration.
We believe the decision depends, to a certain extent, on the stage and scale at which an organization operates. While large entities will take a measured view, SMBs and new setups will have a higher proclivity toward Cloud Contact Centers. For the time being, some of the larger setups are toggling between on-premise and cloud-based deployments and experimenting with a hybrid approach.
Two areas mentioned in almost all tech discussions are security and data. Whether we are talking about digital infrastructure, cloud or application modernization, the conversation comes back to the data it generates and how to manage it. Until a few years ago, data was largely confined to structured data; in other words, structured data was what got captured. But advances in data science have led to the capture of data in myriad shades.
And with such a massive explosion of data, storing and managing it has become an essential part of information services. Enterprise storage has evolved with time: organizations not only rely on on-premise storage solutions but have very effectively managed the transition to cloud-based storage services. The challenge is twofold. The amount of data to manage is increasing exponentially, and, equally important, much of the data being stored is unstructured.
The type of storage depends on various parameters like frequency of usage (hot and cold), archival etc. For organizations, where the value of data is attached to the organization's equity, it becomes a zero-tolerance space. However, while managing such complex environments, specific challenges are bound to come up, like security threats, accessibility issues, etc. And it is essential to note that all challenges are not necessarily infrastructure-led; they can also crop up due to human intervention.
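As a simple illustration of the hot/cold consideration mentioned above, the sketch below maps objects to storage tiers based on last-access age. The thresholds and tier names are assumptions; in practice, storage platforms expose built-in lifecycle policies for this rather than hand-rolled scripts.

```python
# Illustrative hot/cold tiering rule based on last-access time. Thresholds are hypothetical.
from datetime import datetime, timedelta

objects = [  # hypothetical inventory: (object name, last accessed)
    ("invoices/2024-q1.parquet", datetime.now() - timedelta(days=3)),
    ("cctv/2021-archive.zip",    datetime.now() - timedelta(days=400)),
    ("reports/board-deck.pdf",   datetime.now() - timedelta(days=45)),
]

def choose_tier(last_accessed, now=None):
    """Map an object's last-access age to a storage tier."""
    now = now or datetime.now()
    age = now - last_accessed
    if age <= timedelta(days=30):
        return "hot"        # frequently used, low-latency tier
    if age <= timedelta(days=180):
        return "warm"       # infrequent access
    return "cold/archive"   # archival, cheapest per GB

if __name__ == "__main__":
    for name, accessed in objects:
        print(f"{name:30s} -> {choose_tier(accessed)}")
```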
While managing data storage, an enterprise might want to consider:
Terms like cloud-native and born-in-the-cloud are part of the tech industry lexicon. But have you ever wondered what the role of applications is in all of this? Where we keep our infrastructure (public cloud, private cloud or on-premise) is primarily dictated by the applications that reside on it.
For most mid to large organizations that have been in operation for a while, it is quite natural for many of their applications to be monolithic (the traditional way of building applications). In today's context, there is a new breed of applications developed on the concept of microservices, an approach where an application is built and managed as several inter-related services.
There are several advantages in working on microservices architecture, especially if your applications need scalability and are purpose-built for the cloud. Many organizations who have tried to modernize their legacy applications have done so through containers, whether through “lift and shift” existing applications or refactoring them.
However, just because monolithic is a legacy system does not make it bad. There are several advantages to having a monolithic system of designing applications (which is why they existed in the first place). The catchphrase that can be associated with monolithic architecture is simplicity. It is simple to develop, deploy, debug, and test.
But unfortunately, being simple is not enough in most cases. Since monolithic (as the name suggests) has a single code base with multiple modules, the moment an application starts getting complex and big, the development time is usually relatively high.
In effect, monolithic architecture is suitable for small applications. But the moment these applications need scale and their complexities rise, it is best to consider alternative development routes, i.e. with microservices architecture leading the way.
In a business where time to market matters, where application development is viewed as an integrated process, and where the scalability of applications is directly proportional to the success of your business venture, microservices in large part answer most of these requirements. Service-oriented architecture came into being in 1998; it has grown over the years and is now redefining itself in the form of microservices.
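For readers who prefer a concrete picture, here is a minimal, standard-library-only Python sketch of the microservices idea: one small capability exposed as its own HTTP service rather than as a function call inside a monolith. The service name, paths and data are hypothetical.

```python
# Minimal microservice sketch: a single capability (price lookup) exposed over HTTP.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

PRICES = {"sku-1": 499.0, "sku-2": 1299.0}  # hypothetical catalogue

class PriceService(BaseHTTPRequestHandler):
    def do_GET(self):
        # Expected path: /price/<sku>
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "price" and parts[1] in PRICES:
            body = json.dumps({"sku": parts[1], "price": PRICES[parts[1]]}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # Each such service can be developed, deployed and scaled independently of the rest.
    HTTPServer(("localhost", 8080), PriceService).serve_forever()
```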
The enterprise network has expanded, and how: from the confines of the data center to the distributed environment of the public cloud, and on to the edge at remote locations. As a result, the posture of securing the perimeter (mostly the data center) has become redundant. Thrown into the mix is the entirely new paradigm of hybrid work.
Given all these changes, the way we look at network security also needs an update. Concepts like SASE (Secure Access Service Edge) and Zero Trust have become the new norm.
Zero Trust is a paradigm shift in how one approaches infosec. Its key tenet is that no one should be assumed safe, regardless of whether they are outside the system or inside it, which is a fundamental departure from how perimeter security works. The foundation of this paradigm hinges on solid Identity and Access Management (IAM). IAM is not a new concept; it has long been an integral part of cybersecurity, but integrating the postulates of Zero Trust makes it more comprehensive. It should be noted, though, that Zero Trust is essentially a set of guidelines or a framework, not a solution, as some might want you to believe.
Considering that the framework consists of continuous monitoring, which might include multi-factor authentication, it does not necessarily have to make the end-user experience tedious. On the contrary, solutions such as single sign-on (SSO) help mitigate password mismanagement.
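As a rough illustration of the "never trust, always verify" principle, the sketch below shows a per-request access decision in Python; the attribute names and policy thresholds are hypothetical, and a real deployment would delegate these checks to an IAM or policy engine rather than hand-rolled code.

```python
# Illustrative Zero Trust style access check (hypothetical policy).
# Every request is evaluated on identity, device posture and context,
# and access is denied by default; nothing is trusted simply for being
# "inside" the network.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_authenticated: bool   # identity verified (e.g. via SSO)
    mfa_passed: bool           # multi-factor authentication completed
    device_compliant: bool     # endpoint meets posture policy (patched, encrypted)
    risk_score: float          # 0.0 (low) to 1.0 (high), from continuous monitoring

def allow_access(req: AccessRequest, resource_sensitivity: str) -> bool:
    """Deny by default; grant only when every check passes."""
    if not (req.user_authenticated and req.device_compliant):
        return False
    # Sensitive resources require MFA and a low running risk score.
    if resource_sensitivity == "high":
        return req.mfa_passed and req.risk_score < 0.3
    return req.risk_score < 0.7

# Example: even a compliant, MFA-verified user is re-evaluated for every request.
print(allow_access(AccessRequest(True, True, True, 0.1), "high"))   # True
print(allow_access(AccessRequest(True, False, True, 0.1), "high"))  # False
```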
While Zero Trust has found acceptance conceptually, there have been incidents of implementation challenges.
Considering the advantages, organizations can adopt Zero Trust without abandoning their legacy systems. Ideally, enterprises should integrate their existing systems with the Zero Trust framework. Every successful deployment begins with small batches and, once proven, is taken enterprise-wide; Zero Trust adoption will follow a similar path.
Security challenges in Hybrid Work
The concept of hybrid work is pandemic-induced. Before the pandemic, most of the workforce used to go to the office, whereas during the pandemic, almost all of them (over 95%, give or take) worked out of their homes or remote locations. But now, as we emerge from that period, we see a balance, with some people working in-office while the rest work from their homes. This has never happened in the past, and precisely because of that, the challenges that come with it are also unprecedented.
New and emerging ways of securing the enterprise have come up now that data, applications and devices are no longer confined to the data center. They had to, since the perimeter has expanded. Two frameworks which have created a lot of buzz are SASE and Zero Trust.
With the cloud and the internet being the bedrock of all that is happening, the concept of network security has to transform. But a more considerable challenge is identifying the best solution portfolio (or security framework) that fits your enterprise's needs. It's a conundrum created because of the plethora of offerings in the market.
When we speak about SASE and Zero Trust, there are questions on which of the two frameworks better suits an organization's requirements. We think a better approach is to see them working in tandem and not against each other. SASE brings together cloud-based security aspects like CASB and SWG with WAN capabilities. Zero Trust is an integral part of this framework, where its chief purpose is to manage access control.
SASE is a relatively new framework, and vendors and end-users alike are still trying to figure out where it best fits into their scheme of things. Given the current position of the market, we see it gaining mind share in the future.
Since data is the new currency in the world of business, one can conclude that the more the data, the richer the organization. While true in parts, to achieve this, it is vital to have a tight leash on your enterprise data management strategies.
A big challenge that many organizations face is around data silos. As an organization expands, issues around data silos start surfacing in most cases. The expansion could be geographic, departmental or along business lines.
The challenge with data silos is that it is hard to identify an issue, given the very nature of the problem. The telltales are when you cannot accomplish as much as you set out to, but then again, it is hard to objectively quantify these expectations.
But instead of just focusing on the challenges that arise due to data silos, it is essential to understand why they originate in the first place. In a world which is rampant with security and privacy threats, everyone does their best to keep their data confidential. As a result, they isolate the data from the rest of the world. If not properly managed, the outcome of this can take the form of data silos.
Breaking the Silos: In most cases, silos are created unintentionally. Breaking them can involve anything from simple people-driven processes to more complex steps as part of systematic enterprise data management.
The last two years have been tough on several industries. Cloud service providers, on the contrary, had a dream run. While cloud growth across the world, especially in India, was clocking healthy double-digit growth rates, the pandemic worked as a catalyst pushing the boundaries of growth.
Public cloud services are internet-based services where provisioning happens remotely, which was a key reason why several companies considered deploying their workloads on the cloud. While this may have been the reason for entry, many organizations that have taken the initial steps are now exploring how to make the cloud an integral part of their IT strategy.
While the growth rates for all the three, i.e. SaaS, PaaS and IaaS, have been very robust, IaaS has the edge over the rest in terms of CAGR, while SaaS leads in size.
We highlight some of the broad reasons for growth for each.
SaaS
PaaS
IaaS
The world of enterprise technology has witnessed massive changes in the last decade. We saw the emergence of virtualization, cloud computing, big data, blockchain, and IoT, among other things. No singular decade would have witnessed so many things happening simultaneously in modern history (at least in the realms of the enterprise world).
However, so much disruption can also lead to challenges. Security is one area that finds a mention regardless of which technology one is speaking about. Security is the horizontal without which any technology discussion is incomplete.
Thanks to all the new and emerging technologies, there have been significant advancements in the enterprise space. Still, security concerns have also kept up with the frantic pace of tech development. Thanks to digital transformation and the internet, most organizations have a very high dependency on being connected with the outside world, which is excellent for business and exposes them to several security vulnerabilities. Cybersecurity attacks can undo a lot of great work that a company has done over the years.
To manage and counter such attacks, many solutions and products are available in the market. In fact, there are so many of them that the industry is facing an issue called the "problem of plenty".
In a typical enterprise, there are usually two significant challenges. The first is staying on top of the pile of available solutions in the market, understanding the issues they address and correspondingly choosing the right ones. The second is nurturing and retaining the IT talent that will do this for a CIO or a CISO. Both are a handful, leading many IT leaders to Managed Security Service (MSS) partners.
Some of the benefits include the availability of leading tech experts and solutions, and predictable billing and hence the potential for long-term cost savings. Not to forget the round-the-clock monitoring of an organization's IT assets (including data, applications, appliances, endpoints, networks etc.), which is fairly daunting if managed in-house.
Once an organization decides to go for MSS, there is one more consideration: should they go for a fully managed MSS or a co-managed MSS? Organizations with smaller IT teams are advised to go for a fully managed model. In contrast, large organizations that have invested in human capital and built teams over the years can go for the co-managed model.
Then comes the decision of choosing the right vendor. Again, one needs to make essential judgment calls—more about that for another analysis of the day.
Few technologies have made an impact as powerful as IoT. From consumer devices to enterprise solutions, the opportunities are limitless. Connected devices offer a range of benefits, including cost reduction, increasing efficiency and productivity, enhanced customer experience, mobility and agility, and better use of resources and assets, among many other things.
But the most significant benefit of the IoT ecosystem is the massive amount of data generated by these IoT devices. What you have is a goldmine of information that, if leveraged correctly, can unleash business opportunities and growth levers for an organization. This is where analytics comes in. Data analytics has a significant role in the future success of IoT applications by securing high RoI.
One could consider the insights that get churned from these IoT devices as the fuel which propels the entire IoT ecosystem (also providing opportunities for managed services model). The actual value of a successful IoT deployment is when one can derive intelligent insights that can pave the way for new businesses or help extend an organization's service offerings.
Analytics in IoT can be further broken down based on the kind of challenge they address and the insights they generate.
Predictive: In some ways, this could be considered a subset of real-time analytics; machine learning capabilities are incorporated to assess the likelihood of a future event happening. This is possible given the vast amount of data generated by IoT devices that work as historical data or reference.
Descriptive: This kind of analytics is primarily used for monitoring the performance of devices and helps in finding anomalies and identifying trends or patterns in usage.
Real-time: This deals with live data originating from multiple IoT sources. The analysis is done in real time, and action is taken based on that analysis (a simplified sketch of such a check follows below).
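As a small illustration of the descriptive and real-time side, the sketch below (in Python, with made-up temperature readings and thresholds) flags a reading as anomalous when it drifts well away from the recent rolling average; predictive analytics would then take such historical streams and fit a model on top of them.

```python
# Illustrative real-time anomaly check on an IoT sensor stream.
# Readings and thresholds are hypothetical; a production pipeline would
# consume these from a message broker and persist them for predictive models.
from collections import deque
from statistics import mean, stdev

WINDOW = 10          # number of recent readings to compare against
THRESHOLD = 3.0      # flag readings more than 3 standard deviations away

def detect_anomalies(readings):
    window = deque(maxlen=WINDOW)
    anomalies = []
    for t, value in enumerate(readings):
        if len(window) == WINDOW:
            mu, sigma = mean(window), stdev(window)
            if sigma > 0 and abs(value - mu) > THRESHOLD * sigma:
                anomalies.append((t, value))   # descriptive: an anomaly was observed
        window.append(value)
    return anomalies

# Simulated temperature stream with one faulty spike.
stream = [21.0, 21.2, 20.9, 21.1, 21.0, 21.3, 21.1, 20.8, 21.0, 21.2, 35.5, 21.1]
print(detect_anomalies(stream))   # -> [(10, 35.5)]
```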
Each organization has its reasons to move to the cloud. Some do it for cost optimization, while others do it to scale operations.
The cloud journey means different things to different organizations. There are some whose foundation is on the cloud; these organizations are also known as cloud natives or born in cloud companies. Since they are built ground up on the cloud, they don’t have to deal with legacy systems or infrastructures.
But the rest of the organizations, which have been in business for a certain period, are bound to have some, if not the majority, of their workloads on-premise. This is especially true for large organizations that have been in business for long.
Not only do they have essential workloads in their data center, in many cases, the applications are monolithic. So what one has is a system built on legacy infrastructure and monolithic workloads. This is not necessarily bad because these IT infrastructures were custom-built and have stood the test of time.
When a company adopts cloud, it does so purely for tangible business benefits. And many organizations (including large ones) have come to realize there are instances when you are left with no choice but to move to the cloud, purely because of what the cloud offers in the form of scale, reduced lead time etc.
Once an enterprise decides to embark on the cloud journey, a critical element follows: migrating the chosen workloads to the assigned cloud platform. One thing is sure: the migration process is no walk in the park. More often than not, a cloud project fails primarily because of issues in migrating the workloads, so much so that it defeats the entire purpose of going to the cloud.
It is vital to have a clearly defined cloud strategy and not take it as an ad-hoc project. A good business strategy has various elements, which are first defined and then a team is entrusted with implementing it. A cloud rollout should also follow a similar structure.
One of the fundamental tenets of the cloud is to simplify managing IT infrastructure. But as more workloads go to the cloud, issues around vendor lock-in and lack of visibility across multiple cloud environments are surfacing. While there are no easy answers to this, based on our understanding, it is best to spread your cloud portfolio across multiple cloud service providers while maintaining a hybrid approach. Also, it is imperative that due importance is given to regular training on how to use and manage cloud environments. Offloading workloads to the cloud helps increase the bandwidth of internal IT teams, but it is essential to acknowledge that a successful deployment is not the sole responsibility of the cloud service provider; it is an equal partnership.
In each phase of tech innovation, certain technologies work in tandem to create synergies. There was a time when SMAC (social, mobile, analytics, and cloud) saw a convergence of these technologies, driving business innovation.
We see a similar convergence in IoT, 5G, and edge computing coming together. In many ways, the success of each of these technologies rests on how well enterprises can understand and leverage the interplay each has with the others and, as a result, create the relevant narratives that business leaders understand and that cater to their specific industry requirements.
While we have heard of instances of managed IoT services, to truly be in control of these converging technologies, we feel the role of managed services becomes very critical where the service offering covers the entire gamut, i.e. the network infrastructure (5G), edge computing and IoT. If this can be provided in the form of a platform, we feel it will find a lot of resonance amongst enterprises.
While it is an even playing field for hyperscalers, telcos, MSPs, and CDN providers, telcos should use their network advantage as a position of strength and create a detailed service portfolio. This would be a significant step since it requires considerable investment and expertise to build such a service mesh. But it will take them beyond their traditional product pitches and confirm their credentials as an end-to-end ICT solution provider.
And as data indicates, revenue realization is not the key driver for the uptake of these emerging technologies. Companies realize that if the focus is on the fundamental tenets of business that have evolved (safety, efficiency, innovation), monetization will be a by-product of the success derived from these transformational changes.
It has been 27 years since the first mobile call was made in India. A lot has changed since then, making India one of the biggest telecom markets in the world. We are genuinely a mobile-first country. So far, we have had four generations of wireless services, with the fifth one to be launched soon.
Mobile wireless has been a consumer-driven business, with more than 90% of revenue coming from there. But that will change with 5G. 5G brings a massive leap in internet speed, reduced latency and increased capacity, altogether driving improved reliability. And it will hopefully be backed by enterprise-grade SLAs, which we feel will be a big differentiator.
Not to suggest that 5G won’t have creditworthy use cases in the consumer business; of course it will, with so much to expect in entertainment, communication, gaming, and connected devices. But the significant impact is likely to be in the enterprise space as new and emerging use cases come up in time.
Also, with the advent of 5G, we hope to see the concept of private 5G come up. As a concept, it has existed in the market with private LTE but did not have an uptake as expected. With 5G finding more relevance, especially in IoT and edge networks, it would be interesting to see how this space grows.
Like in any tech adoption, issues regarding security will always be an important area of consideration. Since 5G is the latest offering, carriers will make sure they have a zero-trust framework in place. However, for businesses that build network services on top of 5G, understanding these new security models will be critical to ensure they build comprehensive, secure services and follow the norms governing the industry.
Also, there is a challenge with data proximity. Dense urban areas will have better coverage, but the further we move from the infrastructure, the bigger the shortcomings become.
But, clearly, the benefits far outweigh the concerns, and it is only a matter of time to see how all of this gets rolled out and finds its champion use cases, hopefully, some of which are genuinely local and pertinent to the Indian market.
Whenever we speak about technology adoption in India, there is always a mention of how cost-conscious we are as a nation. However, when it comes to critical drivers for the adoption of SD-WAN in India, areas such as reduction in the complexity of hybrid infrastructure, management of hybrid networks, and faster deployment of new locations feature high on the CIO's priority matrix.
While we remain a cost-conscious nation (and for reasonable measures), networking cost does not feature as a top draw because the SD-WAN value proposition in India does not offer the significant network cost saving as is the case in the western world (due to significantly less price differential in MPLS and Internet Leased Line).
Moreover, traditionally enterprises in India have been reluctant to replace their MPLS links with broadband due to unreliability and security issues. But that is expected to change in the near term with improvement in fixed broadband quality and integration of SLAs. This can lead to the adoption of SD-WAN among Indian enterprises.
It will also be interesting to see the role of 5G in this regard. Given its promise of low latency and robustness of network quality, it could be used for last-mile connectivity, fulfilling requirements for branch networking. With a combination of SD-WAN intelligence with the reach and flexibility of 5G, service providers can usher in a new breed of offering into the market.
In the future, hybrid networking will be the go-to model for enterprises in an SD-WAN environment, with MPLS, ILL, and broadband (and perhaps 5G) playing key roles.
The office isn’t dead.
While hybrid work is the future of work, we mustn’t undermine the value of an office setup. And for hybrid work to be successful, the advantages of working from a remote location and an office must feed off each other. In other words, while there are many advantages of working from anywhere, to take it a notch above, the perceived benefits of an office (both from an employee or an employer perspective) somehow need to be mapped into a remote setup. And these are not a few steps that an organization can take; instead, it is an organizational transformation process that we are talking about.
We believe, like any living organism, every organization has its DNA, making it unique. Every forward-looking organization strives to have a set of company values and ethos which helps them identify who they are.
When we speak about workspace modernization, various technological products or solutions come together and help organizations achieve their objective. VDI, UC, and Unified Endpoint Management (UEM) with a Zero Trust Security framework, to name a few.
These solutions have truly helped cross the divide. Still, true success would eventually lie when the employee experience is as seamless as what they would experience in an office environment and where an organization can foster a culture that it envisages despite location differences.
To conclude, while devising a hybrid work strategy, enterprises need to consider the critical purposes physical offices fulfil for organizations/employees and ensure that the shift in the working model does not compromise these purposes/objectives.
Most of our DCs are currently concentrated in major metros like Mumbai, Delhi NCR, Bangalore & Chennai. Cities like Mumbai and Chennai have been traditional choices since most subsea cables have their landing stations in these cities. However, Delhi NCR has recently picked up the pace, keeping in mind the constant rise in demand for colocation services.
While the current batch of DCs have a considerable footprint (both in size and power) as they cater to hyperscaler requirements that need that kind of size, the future will see a healthy mix of edge data centers. In our view, there are two distinct reasons for this proliferation.
First, edge data centers will be an essential step toward the path of digital transformation in the rural sector along with Tier 2 &3 towns. These edge DCs will provide the necessary infrastructure to expedite many of the Government of India's technology initiatives. Edge-based applications that support financial inclusion and literacy are just some of the use cases that can be created as a direct outcome of setting up these sites. It will also allow the local professionally skilled workforce to participate in this transformation.
Second, when we speak about emerging technologies, one cannot have a conversation without mentioning 5G, IoT, and AR/VR, amongst other things. The need for ultra-low latency, high bandwidth, and computing resources near the perimeter binds all of them. While technologies like AR/VR have existed for a couple of years, adoption at scale would need edge DCs.
Speaking about AR/VR, the most important use case is in the form of the Metaverse. While there is a lot of confusion (we are still trying to make meaning of it) about what Metaverse is, we are confident of 3 key things which will make it a "reality":
Essentially, the Metaverse will require constant, instantaneous, high bandwidth data transfers that are not possible without edge computing. High-capacity IT infrastructure would be necessary to support the idea of the Metaverse, making edge data centers more crucial than ever.
Our view is that while the hyperscalers will continue to dominate the big metros, there will be a balance in the form of edge data centers spread across the nation, making it a more stable platform for nationwide digital growth.
With time we have witnessed a steady trend of many organizations moving their DCs from on-site to a co-located space to reduce capital expenditure, achieve higher operational efficiencies, and leverage state-of-the-art infrastructures of colocation facilities.
More than half of the organizations participating in this research study have indicated that they intend to use colocation services for all or majority of their DC requirements.
As mentioned above, one of the significant advantages of colocation services is the reduction of capital expenditure. Colocation, by its very nature, converts this capital expenditure into an opex-based model. However, the fact that the expenditure model has shifted from capex to opex does not by itself guarantee savings in the long run.
The price listed on your contract may not include some of the value-added services and support-level expectations that are mission-critical for your organization.
We must be aware of the services/solutions offered by these colocation players to maximize the value of the deals we are getting into. By no means exhaustive, but here are some of the areas you might want to consider during your next round of negotiations on colocation facilities.
Data backup: Business continuity and Disaster Recovery are strong use cases but are not usually included under basic colocation fees. While you can lease additional space to implement a backup service, managing it can be complex. Some of the colo players provide backup services as a managed service.
Datacenter Interconnect: Internet connectivity and bandwidth typically aren't included in colocation fees. Moreover, technologies such as Datacenter Interconnect, where two or more DCs are connected over short, medium, or long distances using high-speed packet-optical connectivity, are also not part of the basic pricing. From BC/DR to helping enterprises scale their infrastructure, use cases of DCI have grown in recent years.
Service Support: Since colocation services are hosted from remote locations, timely support becomes a critical element. It's essential to have a detailed discussion about all the provider's support costs and response times and what features are included in the costing.
When we speak about the 5G rollout, most discussions hover around the various use cases in the enterprise segment, covering a large cross-section of industries, including manufacturing, energy and utility, healthcare, and logistics.
One of the critical differentiators for enterprise service offerings is stringent SLAs that service providers must comply with. Given the hype around 5G, a lot is riding on key success stories emerging early in some verticals. This is a big step up from a consumer-driven world that does not usually have to conform to such stringent guidelines, making the margin of error low as far as the enterprise market goes.
Since many mission-critical applications and business functions will rely on uninterrupted 5G-based network services, a few things need to be kept in mind.
Also, issues around proximity and rural rollouts will need to be managed long-term. Understandably, most rollouts will centre around large urban metros, ensuring dense urban areas get better coverage during the initial phase. But the issue of proximity still might be a concern since the high band spectrum results in reduced coverage. For India, the sweet spot is the mid-band spectrum since it provides coverage and capacity.
Also, since much of the 5G deployment will happen over the 4G spectrum, one way to manage bandwidth limitations will be through network slicing, which takes a page out of network virtualization's playbook.
Over the past two years, organizations have turned to a wide array of solutions to help power remote work. One of them has been VDI. While there are clearly many benefits, organizations have also faced challenges in the past.
Like many enterprise applications, moving from a CAPEX-based on-premise model to a cloud-hosted model (Desktop as a Service) seems inevitable.
But this transition will be successful only if it overcomes the challenges usually faced by organizations managing VDI on their own. One of the biggest issues with VDI has been manageability and handling the complexity of the solution. DaaS seems to cover these aspects since it is offered as a managed service by cloud vendors. In DaaS, a third-party service provider offers end-to-end managed services, relieving enterprises of much of the management complexity. The service provider configures the virtual infrastructure depending on the requirements of the end-user.
As far as cost goes, if an organization has predictable growth requirements, then upfront investments can be cheaper in the form of VDI deployment. However, since DaaS requires no CapEx, the model makes it easier to dynamically scale operations. Hence, companies with fluctuating requirements or those anticipating sharp growth can opt for DaaS.
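A back-of-the-envelope comparison makes this trade-off tangible. The numbers below are purely hypothetical placeholders (not vendor pricing) and simply show how a predictable, steady user base can favour the upfront VDI investment, while fluctuating seat counts favour the pay-per-seat DaaS model.

```python
# Hypothetical cost comparison: upfront VDI vs pay-per-seat DaaS over 36 months.
# All figures are illustrative placeholders, not actual vendor pricing.

VDI_CAPEX_PER_SEAT = 700       # on-prem infrastructure sized for peak concurrent seats
VDI_OPEX_PER_SEAT_MONTH = 10   # ongoing ops cost per seat per month
DAAS_PER_SEAT_MONTH = 45       # subscription cost per seat per month

def three_year_cost(monthly_seats):
    """Return (vdi_cost, daas_cost) for a list of monthly seat counts."""
    peak = max(monthly_seats)                       # VDI must be sized for the peak
    vdi = peak * VDI_CAPEX_PER_SEAT + sum(s * VDI_OPEX_PER_SEAT_MONTH for s in monthly_seats)
    daas = sum(s * DAAS_PER_SEAT_MONTH for s in monthly_seats)
    return vdi, daas

steady = [200] * 36                  # predictable, flat demand
spiky = [80] * 24 + [300] * 12       # low base with a late surge

print("steady:", three_year_cost(steady))   # upfront VDI works out cheaper here
print("spiky :", three_year_cost(spiky))    # DaaS wins once capacity must be sized for the peak
```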
As a solution, VDI is well positioned to be an important element of the workplace of the future. Please bear in mind that VDI is software, while DaaS is a service. While DaaS is a growing market (VDI has comparatively tapered growth rates), each model is more appropriate for some use cases than the other. Which gets used (VDI or DaaS) depends entirely on the needs, scale and stage of the organization.
Like the phoenix, VDI continues its magical run with the hope that it finds its rightful place in the enterprise tech mainstream. No product has reinvented itself like the way VDI has.
VDI has two avatars: one pre-pandemic, and the other spanning the pandemic and after.
Unlike server virtualization, VDI never found as many takers, and the reasons were many. Server virtualization was easy to manage, savings were enormous, and CAPEX savings were easy to show, among other benefits. In VDI, on the other hand, the solution was complex most of the time, and monetary savings were not a benefit. The use case for VDI in the initial phase was confined to a few industries, i.e. call centers and the education sector.
But with the pandemic came the concept of remote work, and VDI found its true calling. VDI enabled remote work, and the onset of the pandemic further bolstered its use for remote work operations and management. Employees working from home or field technicians needed a reliable way to access their applications and tools. VDI or DaaS (cloud-hosted virtual desktops) allowed users to work from anywhere as long as they had access to the Internet.
The concept of remote work also has come a long way. A decade back, it mainly centered on cost savings, but now we are seeking competitive advantage within remote working; deployment at scale across key industry verticals, extending into next-generation technologies.
Some of the distinct advantages of VDI to enable the vision of future work are:
SD-WAN is the new poster boy in the enterprise networking space. In India, the SD-WAN market is primarily driven by large enterprises due to the growing business and operational needs caused by the widespread adoption of cloud (SaaS, PaaS, IaaS), mobility, and other digital solutions across multiple branches. As far as SMBs are concerned, it is primarily wait and watch, with a rise in uptake expected if fixed broadband becomes a mainstay in this market segment.
This leaves us with the mid-market. Sandwiched between the two, it makes an excellent case for a managed SD-WAN offering, since large enterprises are reasonably equipped to manage end-to-end deployments on their own. That said, based on our interactions and market understanding, even a co-managed deployment model finds takers amongst large establishments.
Some of the critical reasons for organizations going for managed services are:
Also, let us not forget an essential element of any business transaction these days, i.e. SLAs. How service providers give assurances for SD-WAN services will be crucial for enterprises opting for managed SD-WAN services and eventually identifying the partner of choice to offer these services.
While choosing the right partner for your managed SD-WAN is pertinent, an organization must identify the right underlay service provider which is best suited to meet an organization's digital infrastructure requirements.
Essentially, all three types of deployment models (Fully Managed, DIY & Co-Managed) will continue to co-exist. Each model has the desired benefits specific to a particular customer segment.
SD-WAN has been in the news for a while, with a few industries edging others by taking the leap of faith in deploying SD-WAN across several sites.
While the market looks very optimistic, the reasons for adopting in India vary compared to the rest of the world. SD-WAN is championed because of its cost competitiveness as a replacement for more expensive MPLS links. While that works as a strong reason for adoption globally, it is not necessarily so in India. The cost differential is not that significant in India. The critical reasons for adoption stem from the technological benefits that an overlay technology like SD-WAN provides, i.e. seamless management of networks and faster deployment of new sites.
Driving the wave of adoption are two industries which have always led from the front as far as tech innovation goes. In the case of SD-WAN, it fits the nature of the business since both operate in the hub and spoke model. The two industries are banking and Retail (e-commerce in particular).
With its focus on service innovation, banking has been known to lead from the front. Retail banking, in particular, has changed a lot over the last decade. New business models have evolved where online banking, mobile banking, and electronic payments are growing at a pace driven by consumers. As a result, the sector is relooking at the local branch in a whole new way.
While the customer is king in every industry, that finds a whole new meaning in the retail sector. Retailers are always trying to answer, and re-answer, one question: “How can we improve the customer experience?” Retail is no longer constrained by either time or geography. In a low-margin industry, the need for agility and flexibility can be addressed by adopting digital infrastructure initiatives, which help retailers scale and reach the last mile in the most effective manner possible.
At Think Teal, we have devised a four-pronged approach for an organization to achieve its Hybrid Workspace objectives. The four dimensions are
In a workplace, two vectors are at play, i.e. the employer and the employee. One can consider true success in the workplace when both parties consider themselves in a position of advantage.
Usually, the focus is on security aspects (and rightfully so) while ensuring work flexibility. But these past two years have taught us many things, and one of the biggest learnings is not to make the mistake of taking your employees for granted. For instance, last year was a watershed year in terms of mass resignations. Retaining talent suddenly became a strategic business initiative for many organizations.
According to a study by Think Teal, about 42% of employees feel “Frustrated” when they struggle to get access to Company Resources on time. On the other hand, 1 out of 3 organizations acknowledges that a dip in employee productivity leads to employee attrition.
To make employees feel included irrespective of whether they are working from home or the office, enterprises should emphasize providing a consistent experience, be it easy access to company resources or application availability for better productivity. Maintaining application availability and performance, and providing easy access to these applications for employees working from anywhere, is critical for businesses. Therefore, organizations today are looking toward solutions like Application Delivery Controllers (ADCs) to ensure smooth access and availability.
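At its core, an ADC sits in front of application servers, continuously health-checks them and steers each user request to a healthy instance, which is what keeps access consistent regardless of where the employee connects from. The sketch below is a highly simplified illustration of that idea in Python, with a hypothetical server list and health check; real ADCs add TLS offload, caching, global load balancing and much more.

```python
# Simplified illustration of what an Application Delivery Controller does:
# health-check a pool of application servers and route each request to the
# next healthy one (round robin). Server addresses are hypothetical.
from itertools import cycle

SERVERS = ["10.0.0.11", "10.0.0.12", "10.0.0.13"]

def is_healthy(server: str) -> bool:
    """Placeholder health check; a real ADC probes an HTTP endpoint or TCP port."""
    return server != "10.0.0.12"   # pretend one instance is down

def route_requests(n_requests: int):
    healthy = [s for s in SERVERS if is_healthy(s)]
    if not healthy:
        raise RuntimeError("no healthy backends available")
    pool = cycle(healthy)
    return [next(pool) for _ in range(n_requests)]

# Five incoming requests are spread across the two healthy servers only.
print(route_requests(5))
# ['10.0.0.11', '10.0.0.13', '10.0.0.11', '10.0.0.13', '10.0.0.11']
```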
Cloud Computing, IoT, Big Data, Analytics – These are some of the most trending terms in the Enterprise Technology space. While they are unique areas, their interdisciplinary nature makes them tick all the necessary boxes of today’s enterprise requirements.
Big Data is a relatively new concept with its share of business benefits and challenges. There is no denying this is an expensive area to invest in, and the returns can take time. Not to mention all the other types of challenges that have been mentioned above.
While many large organizations might have the wherewithal, it is never an easy task to undertake. There are so many moving parts in a Big Data exercise. In most cases, the biggest hurdle lies in the quality of the data. Studies indicate that data scientists spend as much as 60-70% of their time curating data before it can even be used and further analyzed. A direct analogy could be drawn with the world of clinical research, where gestation periods are very long with no guarantee of success. But then, in both scenarios, the far-reaching benefits have propelled the respective industries forward, helping them navigate past all the hindrances that come their way. Another issue that deserves attention is the value attached to the data in question. Gaining a view of ROI for such complex projects is not an easy task.
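A small, hypothetical example of that curation effort, sketched with Python and pandas: even for a toy dataset, duplicates, missing values and inconsistent labels must be dealt with before any analysis can begin, and it is this unglamorous work that consumes the bulk of a data scientist's time.

```python
# Illustrative data-curation step with pandas on a toy, made-up dataset.
# Real pipelines repeat variations of this at far larger scale.
import pandas as pd

raw = pd.DataFrame({
    "customer": ["Asha", "asha ", "Ravi", "Ravi", None],
    "city":     ["Mumbai", "mumbai", "Chennai", "Chennai", "Delhi"],
    "spend":    [1200, 1200, None, 800, 650],
})

clean = (
    raw.assign(
        customer=raw["customer"].str.strip().str.title(),  # normalise spelling/case
        city=raw["city"].str.title(),                      # normalise labels
    )
    .dropna(subset=["customer", "spend"])                  # drop rows missing key fields
    .drop_duplicates()                                     # remove records that are now identical
)

print(clean)   # two usable records remain out of five raw rows
```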
Quality workforce has been an issue. This translates to a big job market for people with the requisite skill set. But a more systematic approach to this is how the entire organization views data. Big Data can’t be within the confines of just a few; it has to be a collective effort. On a short-term basis, one can outsource the function to an analytics company, but the ideal route would be to develop the expertise in-house making it a long-term strategy.
Many people have asked us: will the exponential growth of cloud services in India hurt the colocation business? The question arises because colocation and cloud may seem like opposing forces, but in reality, they go hand in hand. Let us understand how.
Over the last few years, enterprises have chosen a mix of infrastructure solutions that include cloud and colocation deployments. As far as the colocation business goes, it is divided into two segments, hyperscalers and retail. Most Cloud Service Providers (CSPs) come under the hyperscalers category.
In fact, till now, the retail business (enterprise) has been the more significant contributor. But this would change as we go forward with hyperscalers garnering the lion's share. The colocation business growth is strongly linked with how the cloud business performs in India since most of these CSPs are housed in these colocation spaces.
Hence, the demand for colocation services (CAGR in the mid-20s) can be attributed to rapid growth in the hyperscaler market (CAGR in the mid-30s).
The colocation business has all the signs of being a sunrise sector where demand seems to outstrip supply. Therefore, the DC market will continue to attract a lot of investment for new capacity built up from the supply side. We have seen a steady influx of foreign players and investors showing interest in this sector, and with the newly found infrastructure status, things can only become better.
Cloud Communications is a hot area now. From the early 2000s, there was interest in this area, but remote work (primarily due to the global lockdown) pushed the envelope.
Cloud Communications can be classified into three sections, i.e., CPaaS (Communication Platform as a Service), UCaaS (Unified Communication as a Service) and CCaaS (Contact Center as a Service). While UCaaS and CCaaS can be viewed as specialized SaaS for enterprise communication purposes, CPaaS is a platform (PaaS) that helps organizations integrate communication services into their existing applications through APIs.
Of the three, UCaaS is the biggest in revenue, followed by CCaaS and CPaaS.
While CPaaS may be relatively smaller (and still developing), it is a promising market, one where telcos and cloud-based service providers are jostling for a larger market share.
Thanks to the pandemic, both customers and organizations are open to new ways to communicate. Just as new-age contact center solutions offer an omnichannel experience, CPaaS can connect several communication platforms through APIs.
But the most successful deployments have been on OTT platforms like WhatsApp, where enterprises integrate chatbots to enable customer service. Since the application is embedded in a customer's smartphone, there is already a plethora of information with previous communication history at your disposal. This helps in a quicker resolution to problems and an improved customer experience.
CPaaS can also be used for payments by integrating multi-factor authentication in a client device. Messaging services are the best example of this kind of service (OTP to authenticate user identity). This is also emerging as a critical use case for most organizations with an online presence and significant transactions with end customers.
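A schematic example of how such an integration typically looks: the application calls a CPaaS provider's REST API to deliver a one-time password over SMS or another channel. The endpoint, credentials and payload fields below are entirely hypothetical placeholders; each provider documents its own API, but the shape of the call is broadly similar.

```python
# Schematic CPaaS integration: send an OTP via a provider's messaging API.
# The URL, auth token and field names are hypothetical placeholders, not a
# real provider's API; consult your CPaaS vendor's documentation for the
# actual endpoint and payload.
import secrets
import requests

CPAAS_API_URL = "https://api.example-cpaas.com/v1/messages"   # placeholder
CPAAS_API_KEY = "YOUR_API_KEY"                                 # placeholder

def send_otp(phone_number: str) -> str:
    otp = f"{secrets.randbelow(1_000_000):06d}"   # 6-digit one-time password
    response = requests.post(
        CPAAS_API_URL,
        headers={"Authorization": f"Bearer {CPAAS_API_KEY}"},
        json={
            "to": phone_number,
            "channel": "sms",                      # could equally be "whatsapp"
            "body": f"Your verification code is {otp}",
        },
        timeout=10,
    )
    response.raise_for_status()
    return otp   # store (hashed, with an expiry) server-side to verify the user's entry

# Example: otp = send_otp("+91XXXXXXXXXX")
```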
Cloud Communication is essentially helping tear down the rigid frameworks that were once hallmarks of enterprise communication. In line with the overall theme of the cloud, services like CPaaS are increasing the flexibility to understand and address end-user requirements, which has always been at the core of most successful business operations.
As bandwidth requirements keep on increasing, the need to improve ways of connecting for an enterprise is on the rise. Besides the already existing types of connectivity that are provided by all telecom operators, one type which is slowly gaining traction is what the industry refers to as dark fibre.
As the name indicates, dark fibre is essentially unlit fibre or fibre through which light waves are not passing through. The reason it exists is that when the fibre is laid down, the bulk of the expense is in civil work and getting the licenses and grants. So service providers always install additional capacity to future proof their networks. Enterprises can lease dark fibre and manage it end to end on their own.
While there are specific merits in using dark fibre, it is not for everyone. The two biggest challenges that come to our minds are cost and control. Control can be a double-edged sword: managing a private network entirely on your own has its fair share of challenges, like integrating with existing infrastructure.
Primarily, dark fibre is used by large corporations who have specific network requirements (and can afford it!). Government agencies, financial institutions, and healthcare are some of the industries that are the early adopters, apart from technology giants like hyperscalers who have high captive requirements to fulfil.
According to Think Teal, there are four foundational pillars for successful Hybrid Work Models. They are
All four are equally important and require expertise to manage and maintain. Security is a horizontal without which no enterprise tech discussion is possible. The exhibit indicates that there are different threat vectors when we think about data security. They could be external or internal; they need not always be intentional. However, any lapse in data security has always had repercussions for business, not just financial losses but also brand reputation. And then there are compliance issues to deal with.
According to a recent research report by the Ponemon Institute, the average cost of a data breach increased by 2.6%, from $4.24 Mn in 2021 to $4.35 Mn in 2022, and has risen by 12.7% since the 2020 report, a direct indication of the impact of hybrid work on data breaches.
While there are umpteen solutions available, choosing the right solution becomes a problem in itself!
Some of the trends that are prevalent in the cybersecurity market are:
Over the last few years, we have seen the resurgence of the point-to-point (P2P) connectivity business in India. When we speak about P2P connectivity in enterprise networking terminology, we focus on DLC/IPLC (Domestic/International Private Leased Circuits) or Ethernet Connectivity. The need for such private point-to-point connectivity has risen due to multiple factors, but the primary catalyst for this growth has been the rise of Cloud and Hyperscalers.
Most tech giants require high-speed connectivity with ultra-low latency to connect two or more of their data centers to provide the ever-growing need for a hybrid, multi-cloud IT environment.
This is a complete reversal of fortunes for the P2P market, which saw a steady decline over the last few years before the need for data center interconnect kicked in. The general enterprise trend in network consumption is primarily geared towards MPLS (VPN connectivity) or ILL (internet leased line); most organizations' network requirements get fulfilled by either of the two technologies. But in the case of OTTs, hyperscalers and cloud service providers, since their scale of operations requires direct connectivity between data centers, we see a spike in the uptake of the P2P connectivity business.
The rise of data center interconnect can also draw a parallel with the growth of colocation services in India. Many of the data center providers offer DCI services to their customers.
Many of the manual tasks can be automated by integrating industry-standard APIs into DCI solutions, reducing maintenance hassles.
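For instance, a provisioning task that once needed a ticket and manual configuration can be scripted against the provider's API. The sketch below is purely illustrative, with a made-up endpoint and fields; the point is that capacity between two data centers becomes something software can request and monitor rather than a manual change.

```python
# Illustrative DCI automation: request a point-to-point link between two
# data centers through a provider API. Endpoint and field names are hypothetical.
import requests

DCI_API = "https://api.example-dci-provider.com/v1/connections"   # placeholder
TOKEN = "YOUR_API_TOKEN"                                           # placeholder

def provision_link(site_a: str, site_b: str, bandwidth_gbps: int) -> dict:
    """Ask the provider to stand up a link and return its details/status."""
    resp = requests.post(
        DCI_API,
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"a_end": site_a, "z_end": site_b, "bandwidth_gbps": bandwidth_gbps},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()   # e.g. {"id": "...", "status": "provisioning"}

# Example: provision_link("MUM-DC1", "CHE-DC2", 10)
```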
It’s been over three months since we started operating as an organization. Of the many things you need to do while starting your own company, an important one is to identify its IT requirements. While there are tonnes of information available on the “go-to” products and services, it is never really an easy task setting it all up on your own.
Technology has a role in so many aspects of your business that it can get daunting. From the platform on which you build (and host) your website to productivity applications, the devices to work on, and connectivity, the list is quite long.
We know there is a concept of CIOs on demand, but then those are best fit for either established organizations or startups that are well funded.
The purpose of technology is to make things happen for an individual or an organization. But it becomes great when it is simple to understand and operate. Smartphones are such a success simply because of their intuitiveness.
We are sure we echo the sentiments of many small organizations who wish enterprise technology was as simple as using a smartphone.
And issues don’t end with deciding what to buy. What about troubleshooting an installed application or infrastructure-related problem? Most of the challenges are trivial but complex enough not to get solved.
As companies grow, they build capabilities along the way. Managing technology is somewhat akin to this philosophy. While the ideal scenario is to hire or outsource to manage your IT function (inorganic approach), the prudent way is to build those capabilities internally. The second approach has its drawbacks, but then again, most small organizations don’t have the luxury of spending on additional people. Also, there is a difference between knowing and doing. Anyone who has got his hands dirty (and perhaps made mistakes in the process) will always have the correct intuition of things, even when a specialist is on-board. Because, after all, no one cares about your company as much as you do!
Many believe that the pandemic years were the catalyst for technology adoption. Technologies like cloud found widespread adoption due to the situation that arose (work from home). But there was an equal spike in cybercrime across the globe, and India was no exception.
While the world welcomed the concept of Hybrid Work, it also showed how ill-equipped we were to mitigate the risks involved. The reasons could be many, but one stands out: our exposure and view of the perimeter.
Before the lockdown, most organizations spent all their energies on securing the perimeter, which was, more often than not, the corporate data center and associated office branches. Because let’s face it, most of us worked from offices. But all this changed with the pandemic. Suddenly, more than 95% (give or take) of employees were working out of their homes or remote locations. While that was clearly a stop-gap arrangement to keep the lights on, network security was highly vulnerable, as organizations simply weren’t ready for such a scenario.
Public cloud has been one of the key reasons behind the massive uptake of digital transformation projects that we see being undertaken by organizations of all sizes. Still, it is also one of the key vectors of security breaches. The challenge is that attackers can exploit vulnerabilities in any of your cloud providers (and many organizations are taking a multi-cloud approach).
For network security to be relevant, it has to encompass three strategic areas, i.e. cloud, IoT and the edge. The future of networks lies here; hence, fortifying security in these areas will be an area of focus for many organizations in the future.
PaaS is growing at a CAGR of 28% in India as the sector gains momentum. Given India’s massive developer community, this augurs well: not only is the rate of growth high, but, in time, the market will also have measurable volume.
As cloud offerings mature, it is becoming increasingly difficult to distinguish between IaaS and PaaS offerings. While the general trend has been to build the foundation on IaaS, we have witnessed a sharp rise in PaaS uptake, especially in the case of large enterprises or those with a relatively advanced technology function.
For any organization on the cloud journey, the ultimate aim is to build applications on the cloud that can be provisioned as SaaS. And that journey begins with infrastructure, which leads to application development and, finally, application hosting.
However, building a cloud-native ecosystem has its challenges and is a fairly complex task. Reducing these complexities is a strong use case for those envisioning an authentic cloud experience.
And in this world of convergence, a new approach to looking at networks, security, and appliances is part of many of the PaaS offerings available in the market.
Initial years saw PaaS being used for web-based application development, but now the breadth of work done is more expansive, covering areas like databases, AI/ML, networks etc.
With most organizations driving a digital transformation agenda, it is opportunities galore for the entire tech fraternity, be it the product specialists, the service providers and finally, the IT community within organizations who drive these initiatives. While each has a role and is reasonably well defined, as we advance, we can expect a surge in in-house development, and organizations having the necessary expertise with PaaS development will have the required edge.
The telecom sector has undergone many waves of transformation over the last three decades, from 1G, where only voice communication was possible, to now, where the possibilities are limitless. And as a result, operators are constantly pushed to balance capacity & coverage with cost.
With 5G, the need to scale out and be flexible is on the rise. As we have shifted from 1G to 5G, the evolution of RAN (Radio Access Network is a vital component of the mobile network, where individual end-user devices like smartphones, IoT devices or Laptops/Desktops are connected to other parts of a network) has also kept pace.
Cloud-RAN or C-RAN is an answer to the ever-growing network needs. As the term suggests, Cloud-RAN is a cloud platform that operators can use to perform compute functions that were otherwise done on purpose-built hardware platforms. That explains the cost advantage associated with C-RAN. It also supports emerging technologies like network slicing, which have the potential to cut down costs significantly without compromising on security or network quality.
In our view, the closest analogy to the path to C-RAN is how Data Center has transformed over the years. From a growth phase to consolidation, to virtualization and eventually, to Cloud, it has come a long way. But it has taken a while. The journey of RAN from its initial days to its current avatar, which has its foundation in Cloud and Open Architecture, is somewhat similar. It is a journey which has taken years and continues to evolve.
To ensure that your Cloud-RAN is optimally utilized,
A lot goes in when you start a company (tell us about it!). Since everything is a new beginning, there are only limited things that a start-up can focus on. And often, security is not a high-priority agenda for most. This is common knowledge and is unfortunately misused by most perpetrators.
As is the case with most security breaches, they happen when you are most susceptible. With so much going on for a start-up, one does not get the time to focus on the security elements. As data suggests, while many security breaches are carried out on purpose by cybercriminals, there have been several incidents of security lapses purely because of human error.
The biggest asset of a start-up is usually its IP, so an organization does not have to be significant in size or revenue to be a potential target of cyber theft. On the dark web, data is always valued, big or small. Based on industry estimates, 1 out of 3 start-ups has a security breach in the first six months of operations.
Having an online presence has helped many small organizations achieve scale. One can hardly think of a company that does not have an online presence, be it a website, social media handles, an eCommerce portal or tie-ups with giant business aggregators. But an online presence also makes them vulnerable.
Moreover, while there are a lot of positive outcomes of remote work, the last two years also saw a sharp rise in security incidents, and start-ups were no exception.
As a start-up, most organizations are conservative about spending money on anything not strategically important to the business. But as the case has been laid out, security has to be part of that broader vision, whether we like it or not. Some estimates suggest that a small company should invest at least 5-10% of their IT budget in security.
Today employees within an organization have greater access to business data and other information assets that can boost their productivity and facilitate smooth business operations.
In this process of making access to information easier, enterprises tend to side-line the importance of having a data protection strategy where the IT security measures align with the backup, business continuity and disaster recovery strategy of the organization.
As per the Government of India, the country reported around 13.91 lac security incidents in 2022. India was among the top targets of hackers globally and in South Asia.
According to the latest findings, cyber-attacks on government agencies and the healthcare sector grew severalfold in 2022.
In India, most organizations do not have a process to synchronize their IT security capabilities with their BCDR plans. This creates data availability gaps, which can prove costly when businesses face security threats.
For most Indian organizations, IT security and BCDR plans are either loosely tied to each other or operate in silos. This makes having efficient data protection plans complex. Companies need to put in honest efforts towards making their cybersecurity and BCDR strategies work in tandem.
Involving IT security teams during Cyber Resiliency planning, improving IT security and BCDR capabilities to suit modern day data protection requirements could be some of the key steps that organizations can look to implement.
Organizations also need to factor in the evolution of cyber security threats and the underlying complexities in dealing with such cyber threats while formulating and implementing their business continuity and disaster recovery planning.
85% Indians ready to spend more for better CX – Growing Role of CCaaS in Delivering Improved Customer Satisfaction
Can 5G be the Next Big Revolution in India?
5G Technology expected to boost India’s GDP by 2% by 2030
Recently, PM Modi awarded 100 5G Use Case Labs to different academic institutes of India to promote innovation and to develop different use cases in the area of 5G. With emergence of new 5G use cases, every sector in India will see some kind of disruption. 5G will not just transform the lives of every individual in the country but has the potential to drive growth in almost every industry in India.
Growing Influence of CPaaS in India
Market expected to grow at a CAGR of 20% for the next 5 years.
The CPaaS market in India is expected to grow at a CAGR of around 20% for the next five years. The CPaaS market here is experiencing strong consolidation where global players are entering India either by partnering with existing Indian players or through acquisitions. The increasing demand for better and richer customer engagement will push businesses in every industry to embrace CPaaS sooner or later.
Cyber-attacks – Number 1 Cause for Business Disruptions in India
13.91 lac Security Incidents reported in 2022.
In Q1 2023, the country witnessed an average of 2,000 cyber-attacks weekly. The widening skill gap and the growing volume of digital transactions continue to pose complex challenges. However, some positive signs have emerged, with cybersecurity now part of the National Security Policy and a push from the government to minimize reliance on foreign companies for its hardware, software and network infrastructure needs.
Is India the Next Big Hub for Growing Global Tech Skill Demand?
India – the 2nd largest talent pool for AI/ML and BDA globally.
The growing digitalization is creating huge demand for technology professionals in India. As the AI and data analytics startups continue to thrive, there will be further demand for tech talent. With initiatives like Skill India Mission and government-corporate partnerships, the skill gap challenges, especially in areas like cybersecurity can be addressed and this can make India, a global hub for tech talent.
Tech Agility – Becoming Imperative for Indian Organizations
India – One of the Top 10 AI adopters in the World
The big digital push, conducive business environments, and growing appetite for new technologies are pushing enterprises in India to explore new tech-based solutions. Businesses are realizing that it is essential to have agile tech infrastructure that enhances internal processes and improves overall customer satisfaction.
Transforming Finance: How Fintech is Changing the Digital Game in India
India – 3rd biggest Fintech Ecosystem in the World
The push by the government for the financial inclusion of every citizen is driving banking innovation in India. At the forefront of this innovation are fintech companies that are elevating technology to create new financial products and, at the same time, enhance the overall banking experience of every individual in the country.
Catching Up In the GenAI Race : How Far Is India From Its Own ChatGPT Moment?
India ranks as the world's second-largest digital data producer after China, with terabytes of data generated every month.
For India to fully leverage GenAI's potential, a unified approach focusing on education, skill enhancement, and infrastructure development is essential. The combined efforts of the government, the private sector, and educational bodies are crucial for establishing a robust GenAI ecosystem in the country.
Can Modern Servers Champion the Cause of ESG in India?
India among top 5 countries leading in IT asset Optimization initiatives
In June 2022, Prime Minister Narendra Modi launched the ‘Lifestyle for the Environment (LiFE)’ Movement.
This movement aims at bringing measurable and scalable behaviour change solutions that can drive climate-friendly behaviors among individuals, communities and organisations.
As India continues to make its mark on the global digital map, government agencies, private entities and environmental bodies are all coming together to make India an epicentre of the global ESG movement.
Why India Needs a 'Zero Trust' Cyber Security Posture
India emerged as one of the most targeted countries by cybercriminals in 2023.
Key Security Challenges:
Takeaway:
As cyber threats evolve, so must India's defenses, ensuring the protection of its digital assets and the security of its citizens in the digital age.
The push towards a Zero Trust security model, backed by global partnerships and advanced technology, is not just a necessity but an opportunity for India to secure its place as a resilient digital powerhouse on the world stage.
Cloud Contact Center – Maximizing Customer Experience, Minimizing Complexities
76% of Indians disassociate themselves from a brand if its response rate is low
In recent times, cloud contact center solutions have evolved as new features are being integrated into these offerings.
The integration of features like artificial intelligence (AI) based systems, analytics and automation has helped businesses explore new possibilities using cloud contact center solutions.
However, regulatory and compliance requirements restrict businesses from completely replacing their on-prem contact centres with cloud versions.
However, businesses are deploying these solutions for non-critical functions where agile customer service becomes a competitive factor.