
Considerations for your next AI implementation


At present we’re seeing two divergent approaches to the enterprise adoption of AI. Some organizations are racing to implement solutions in an attempt to generate rapid ROI, while others are taking a longer-term view, hoping to reap future rewards from the research investments they make now.

Regardless of where an organization may be on its AI journey, there are common challenges to be faced, including skills shortages, power usage, supply chain issues and budgetary constraints, with any significant AI implementation starting at £10 million. It’s vital that any organization considering an AI implementation invests in the right resources and technologies from the outset to avoid painful, and expensive, headaches further down the line.

Clear trend of investment in AI

According to figures published by Statista, an estimated $934.2 billion was invested by companies in AI technologies between 2013 and 2022, rising steadily year on year. The advent of generative AI has further accelerated AI spending over the past year, with major tech firms including Microsoft, Google and Amazon leading the way, outpacing investment from Silicon Valley VC firms, according to the Financial Times. In addition, a recently published McKinsey report called 2023 a ‘breakout year’ for generative AI, with a third of respondents surveyed stating their organizations are using the technology regularly in at least one business function.

Despite the clear trend of investment in AI, many organizations are finding large-scale AI implementations cost prohibitive at present. In addition to IT infrastructure and people-related costs, it’s necessary to factor in environmental impact and energy usage. Costs could be a temporary constraint for some, but organizations will need a clear path to monetization and ROI for their AI projects to justify the spend, purchase the necessary infrastructure and offset carbon emissions to meet regulatory requirements.

Putting the right foundations in place

Regardless of the challenges, the transformative benefits and value of successful AI projects are too great to ignore. Most sectors are still in the early adopter phase of AI implementation, but adoption is only increasing as use cases are defined and we move beyond the conservative thinking that prevails within many organizations. In preparation for this shift, now is the time to start thinking about what is required to ensure solid foundations are in place for an AI-based future.

To enhance the prospects of successful AI implementation, these are the key things that organizations need to be thinking about:

Accessibility of GPUs

Supply chains need to be assessed and factored into any AI project from the outset. Access to GPUs is critical: without them, an AI project will not succeed. Given the huge demand for GPUs and their resulting scarcity on the open market, some organizations planning AI implementations may need to turn to hosting service providers for access to the technology.

Data center power and space capabilities

AI, with its massive datasets, creates real challenges for already stretched data centers, particularly in relation to power. Today’s AI implementations can demand power densities of 40 to 50 kilowatts per rack, well beyond the capability of many data centers. AI is also changing the network requirements for data centers: a much higher density of fibers is required, together with higher-speed networking than many traditional data center providers can support.

Power- and space-efficient technologies will be crucial to successfully getting your AI project off the ground. Every watt allocated to storage reduces the number of GPUs that can be powered in the AI cluster, so flash-based data storage can help mitigate this problem: it is far more power and space efficient than HDD storage and requires less cooling and maintenance than traditional hard drives.
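To make that power trade-off concrete, here is a minimal back-of-the-envelope sketch in Python. All of the numbers are assumptions chosen purely for illustration (the 40 to 50 kilowatt rack density mentioned above, a nominal per-GPU draw, and notional storage power figures); they are not measurements from any specific product.

```python
# Illustrative rack power-budget arithmetic. Every figure below is an assumption
# for demonstration only, not a vendor specification.
RACK_POWER_KW = 45.0      # assumed rack power density (40-50 kW cited above)
GPU_POWER_KW = 0.7        # assumed per-GPU draw, e.g. a ~700 W accelerator
HDD_STORAGE_KW = 4.0      # assumed power draw of an HDD-based storage tier
FLASH_STORAGE_KW = 1.5    # assumed power draw of an equivalent flash tier


def gpus_supported(storage_kw: float) -> int:
    """GPUs that fit in the remaining rack power budget after storage."""
    return int((RACK_POWER_KW - storage_kw) // GPU_POWER_KW)


print(f"With HDD storage:   {gpus_supported(HDD_STORAGE_KW)} GPUs per rack")
print(f"With flash storage: {gpus_supported(FLASH_STORAGE_KW)} GPUs per rack")
```

Swapping the storage assumption shows how watts freed from storage translate directly into additional GPUs per rack.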

Data challenges

Unlike other data-based projects, which can be more selective about where data is sourced from and what is taken into account, AI projects utilize huge datasets to train AI models and to distil insight from massive amounts of information to fuel new innovation. This presents major challenges in relation to fully understanding AI models, and how introducing new data to a model can change outcomes.

The issue of repeatability is still being grappled with, but one best practice for understanding data models and very large datasets is to introduce ‘checkpointing’, which ensures models can be reverted to an earlier state, effectively turning back time and thereby facilitating a better understanding of the implications of data and parameter changes. The ethical and provenance aspects of using data from the internet to train models have not yet been sufficiently explored or addressed, nor has the impact of attempting to remove selected data from an LLM or RAG vector dataset.
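As a minimal sketch of what checkpointing can look like in practice, the PyTorch-style snippet below saves model and optimizer state during training and restores an earlier state on demand. The training loop, checkpoint interval and file paths are hypothetical placeholders, not details from the article.

```python
import torch


def save_checkpoint(model, optimizer, step, path):
    """Persist enough state to revert training to this exact point."""
    torch.save({
        "step": step,
        "model_state": model.state_dict(),
        "optimizer_state": optimizer.state_dict(),
    }, path)


def load_checkpoint(model, optimizer, path):
    """Roll the model back to a previously saved state ('turning back time')."""
    checkpoint = torch.load(path)
    model.load_state_dict(checkpoint["model_state"])
    optimizer.load_state_dict(checkpoint["optimizer_state"])
    return checkpoint["step"]


# Hypothetical usage inside a training loop:
# if step % 1000 == 0:
#     save_checkpoint(model, optimizer, step, f"checkpoints/step_{step}.pt")
# ...and later, to compare outcomes before and after a data or parameter change:
# step = load_checkpoint(model, optimizer, "checkpoints/step_42000.pt")
```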

Investing in people

Any organization embarking on an AI journey is going to encounter skills shortages. There simply aren’t enough data scientists or other professionals with relevant skills in the worldwide workforce to cope with demand, so those with the right skills are difficult to secure and command premium salaries. This is likely to remain a significant issue for the next 5-10 years. As a result, organizations will need to not only invest heavily in talent through hiring, but also invest in training their existing workforce to develop more AI skills internally.

Conclusion

As organizations mature in their adoption of AI, develop specific use cases, fine-tune infrastructure requirements, invest in skills and chart a clear path to short- or long-term ROI, they may come to realize that these challenges are very difficult to overcome alone. For many, partnerships will be required. This is where there is a real opportunity for cloud service providers, managed service providers and other specialists to offer the services and infrastructure that will help organizations realize their AI goals.


This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro


