At present we’re seeing two divergent approaches to the enterprise adoption of AI. Some organizations are racing to implement solutions in pursuit of rapid ROI, while others are taking a longer-term view, hoping to reap future rewards from the long-term research investments they are making now.
Regardless of where an organization may be on its AI journey, there are common challenges to be faced, including skills shortages, power usage, supply chain issues and budgetary constraints, with any significant AI implementation starting at £10 million. It’s vital that any organization considering an AI implementation invests in the right resources and technologies from the outset to avoid painful and expensive headaches further down the line.
Clear trend of investment in AI
According to figures published by Statista, an estimated $934.2 billion was invested by companies in AI technologies between 2013 and 2022, rising steadily year on year. The advent of generative AI has further accelerated AI spending over the past year, with major tech firms including Microsoft, Google and Amazon leading the way and outpacing investment from Silicon Valley VC firms, according to the Financial Times. In addition, a recently published McKinsey report called 2023 a ‘breakout year’ for generative AI, with a third of respondents surveyed stating that their organizations use the technology regularly in at least one business function.
Despite the clear trend of investment in AI, many organizations currently find large-scale AI implementations cost prohibitive. In addition to IT infrastructure and people-related costs, it’s necessary to factor in environmental impact and energy usage. Costs may be a temporary constraint for some, but organizations will need a clear path to monetization and ROI for their AI projects to justify the spend, purchase the necessary infrastructure and offset carbon emissions to meet regulatory requirements.
Putting the right foundations in place
Regardless of the challenges, the transformative benefits and value of successful AI projects are too great to ignore. Most sectors are still in the early adopter phase of AI implementation, but adoption is only increasing as use cases are defined and we move beyond the conservative thinking that prevails within many organizations. In preparation for this shift, now is the time to start thinking about what is required to ensure solid foundations are in place for an AI-based future.
To enhance the prospects of successful AI implementation, these are the key things that organizations need to be thinking about:
Accessibility of GPUs
Supply chains need to be assessed and factored into any AI project from the outset. Access to GPUs is critical: without them, an AI project will not succeed. Given the huge demand for GPUs and their limited availability on the open market, some organizations planning AI implementations may need to turn to hosting service providers for access to the technology.
Data center power and space capabilities
AI, with its massive datasets, creates real challenges for already stretched data centers, particularly in relation to power. Today’s AI implementations can demand power densities of 40 to 50 kilowatts per rack, well beyond the capability of many data centers. AI is also changing network requirements: a much higher density of fibers is needed, along with faster networking than many traditional data center providers can support.
Power- and space-efficient technologies will be crucial to getting your AI project off the ground. Flash-based data storage can help mitigate this problem, as it is far more power and space efficient than HDD storage and requires less cooling and maintenance. Every watt allocated to storage reduces the number of GPUs that can be powered in the AI cluster, as the rough calculation below illustrates.
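As a minimal sketch of that trade-off, assuming purely hypothetical figures for the rack power budget, per-GPU draw and storage draw (none of these numbers come from the article), the arithmetic looks like this:

```python
# Illustrative power-budget arithmetic (all figures hypothetical):
# a fixed rack budget is shared between storage and GPUs, so every
# watt saved on storage frees capacity for more accelerators.

RACK_BUDGET_W = 40_000   # assumed rack power envelope (40 kW)
GPU_DRAW_W = 700         # assumed draw per GPU, including overhead

def gpus_supported(storage_draw_w: float) -> int:
    """Number of GPUs that fit in the remaining rack power budget."""
    return int((RACK_BUDGET_W - storage_draw_w) // GPU_DRAW_W)

hdd_storage_w = 4_000    # assumed draw for an HDD-based storage tier
flash_storage_w = 1_500  # assumed draw for a flash-based storage tier

print(gpus_supported(hdd_storage_w))    # 51 GPUs with the HDD tier
print(gpus_supported(flash_storage_w))  # 55 GPUs with the flash tier
```

Under these assumed figures, the more efficient storage tier frees enough headroom for a handful of additional GPUs per rack; the exact numbers will vary with real hardware.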
Data challenges
Unlike other data-based projects, which can be more selective about where data is sourced from and what is taken into account, AI projects use huge datasets to train models and to distill insight from massive amounts of information to fuel new innovation. This presents major challenges in fully understanding AI models and how introducing new data to a model can change its outcomes.
The issue of repeatability is still being grappled with, but one best practice for understanding data models and very large datasets is to introduce ‘checkpointing’, which ensures models can be reverted to an earlier state, effectively turning back time, and thereby makes it easier to understand the implications of data and parameter changes (a minimal example is sketched below). The ethical and provenance aspects of using data from the internet to train models have not yet been sufficiently explored or addressed, nor has the impact of attempting to remove selected data from an LLM or a RAG vector dataset.
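As a rough illustration only, here is a minimal checkpointing sketch in PyTorch (the framework and function names are assumptions, not drawn from the article): saving model and optimizer state at intervals allows a training run to be rolled back to an earlier point, so the effect of subsequent data or parameter changes can be compared.

```python
# Minimal checkpointing sketch (PyTorch assumed; names are illustrative).
import torch

def save_checkpoint(model, optimizer, step, path):
    """Persist training state so the run can later be reverted to this point."""
    torch.save({
        "step": step,
        "model_state": model.state_dict(),
        "optimizer_state": optimizer.state_dict(),
    }, path)

def load_checkpoint(model, optimizer, path):
    """Restore a saved state, effectively turning back time for the model."""
    ckpt = torch.load(path, map_location="cpu")
    model.load_state_dict(ckpt["model_state"])
    optimizer.load_state_dict(ckpt["optimizer_state"])
    return ckpt["step"]
```

In practice, checkpoints taken before and after a data or parameter change can be reloaded and evaluated side by side to see how the change affected the model’s behavior.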
Investing in people
Any organization embarking on an AI journey is going to encounter skills shortages. There simply aren’t enough data scientists or other professionals with relevant skills in the worldwide workforce to meet demand, so those with the right skills are difficult to secure and command premium salaries. This is likely to remain a significant issue for the next five to ten years. As a result, organizations will need not only to invest heavily in talent through hiring, but also to train their existing workforce to develop more AI skills internally.
Conclusion
As organizations mature in their adoption of AI, develop specific use cases, fine-tune infrastructure requirements, invest in skills and chart a clear path to short- or long-term ROI, they may come to realize that these challenges are very difficult to overcome on their own. For many, partnerships will be required. This is where there is a real opportunity for cloud service providers, managed service providers and other specialists to offer services and infrastructure that help organizations realize their AI goals.
This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro