IT - AI

Artificial intelligence requires greater processor density, which increases the demand for cooling and raises power requirements

"As artificial intelligence takes off in enterprise settings, so will data center power usage. AI is many things, but power efficient is not one of them.

For data centers running typical enterprise applications, the average power consumption for a rack is around 7 kW. Yet it's common for AI applications to use more than 30 kW per rack, according to data center organization AFCOM. That's because AI requires much higher processor utilization, and the processors - especially GPUs - are power hungry. Nvidia GPUs, for example, may run several orders of magnitude faster than a CPU, but they also consume twice as much power per chip. Complicating the issue is that many data centers are already power constrained..."
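
To see how quickly accelerator-dense racks blow past a typical power budget, here is a rough back-of-the-envelope sketch in Python. The per-chip wattages, server counts, and overhead factor below are illustrative assumptions, not figures from the article.

```python
# Back-of-the-envelope rack power estimate. All figures below are
# illustrative assumptions, not vendor specifications.
GPU_TDP_KW = 0.3          # assume ~300 W per accelerator
CPU_TDP_KW = 0.15         # assume ~150 W per host CPU
SERVERS_PER_RACK = 8
GPUS_PER_SERVER = 8
OVERHEAD = 1.10           # assume 10% extra for fans, NICs, storage

ai_rack_kw = SERVERS_PER_RACK * (
    GPUS_PER_SERVER * GPU_TDP_KW + 2 * CPU_TDP_KW
) * OVERHEAD

print(f"Estimated AI rack draw: {ai_rack_kw:.1f} kW vs ~7 kW typical")
# -> roughly 23.8 kW before cooling, already past a 7 kW design point
```

Even with conservative assumptions, the estimate lands in the 20-30 kW range the AFCOM figure describes, and cooling load scales with it.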


The potential benefits of AI for businesses are huge

"But those benefits could be eroded by hidden biases that damage brand reputation and customer trust, according to a survey of US and UK businesses commissioned by DataRobot.

According to DataRobot's 'The State of AI Bias in 2019,' which was released last week, 42% of organizations surveyed reported being 'very to extremely' concerned about AI bias occurring in their organizations..."
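
For readers wondering what screening for such bias can look like in practice, here is a minimal Python sketch of one common check, the disparate impact ratio. The toy data and the 0.8 rule-of-thumb threshold are assumptions for illustration, not part of DataRobot's report.

```python
# Minimal sketch of one common bias check: compare a model's positive
# outcome rate across groups. Toy data; the ~0.8 threshold (the
# "four-fifths rule") is a common heuristic, assumed here.
from collections import defaultdict

predictions = [  # (group, model_said_yes) - illustrative toy data
    ("A", 1), ("A", 1), ("A", 0), ("A", 1),
    ("B", 1), ("B", 0), ("B", 0), ("B", 0),
]

totals, positives = defaultdict(int), defaultdict(int)
for group, yes in predictions:
    totals[group] += 1
    positives[group] += yes

rates = {g: positives[g] / totals[g] for g in totals}
ratio = min(rates.values()) / max(rates.values())
print(rates, f"disparate impact ratio: {ratio:.2f}")
# A ratio well below ~0.8 is a common red flag worth investigating
```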


When computers were introduced to businesses in the 1950s and 60s, some people looked at them in fear

"'They will replace us all,' they cried. Today, the computer industry is a pillar of our economy, employing millions and propelling society forward. Ironically, some of the same people who have made billions in those industries are warning us that 'we will all lose our jobs to AI and robots,' and believe that one of the main sectors under threat is manufacturing.

In the past decade, numerous European and American cities have lost millions of manufacturing jobs. The city I was born in, Monfalcone, Italy, renowned for building massive cruise ships, is one of them. However, in my city none of those jobs were lost to AI, robots or automation..."


Data security and automation are the top IT projects for 2020, while artificial intelligence projects are not in the top 10 for IT professionals, according to Netwrix

"The online survey asked 1045 IT professionals worldwide to name their top five IT projects for the next year; they could pick from a predefined list or specify their own descriptions. The survey found no dramatic difference in IT priorities among organizations based on size or vertical..."

As enterprises embark on AI and machine learning strategies, chip makers like NVIDIA, Intel and AMD are battling to become the standard hardware providers

"Artificial intelligence is becoming an integral feature of most distributed computing architectures. As such, AI hardware accelerators have become a principal competitive battlefront in high tech, with semiconductor manufacturers such as NVIDIA, AMD, and Intel at the forefront.

In recent months, vendors of AI hardware acceleration chips have stepped up their competitive battles. One of the most recent milestones was Intel's release of its new AI-optimized Ponte Vecchio generation of graphics processing units (GPUs), which is the first of several products from a larger Xe family of GPUs that will also accelerate gaming and high-performance computing workloads..."
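
The practical upshot for developers is that AI frameworks abstract over whichever accelerator wins the hardware battle. Below is a minimal sketch, assuming PyTorch, of device-agnostic code that uses a GPU when one is present; device names here are PyTorch's standard strings, not anything from the article.

```python
# Minimal sketch: the same tensor code runs on whichever device is
# present, so framework users are insulated from the vendor battle.
# Assumes PyTorch is installed.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

a = torch.randn(1024, 1024, device=device)
b = torch.randn(1024, 1024, device=device)
c = a @ b  # dispatched to the GPU's matrix units when available

print(f"ran a 1024x1024 matmul on: {device}")
```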


Sniffing Out Errors
insideBIGDATA, November 26th, 2019
As seasoned analysts will know, it can be difficult to identify when to draw a line under your predictive modelling, accept its performance as sufficient for your purposes, and move on to deployment

"Analysts and Data Scientists will be familiar with examining residual plots for their models and looking for outlier errors that may indicate that the model's underlying assumptions have been broken, or that some of the data points might be extreme outliers that cause grave problems when trying to build a model on the whole data set.

But while examining residual plots is great from a qualitative point of view, as data natives, we should always be looking for quantitative methods for describing, classifying and understanding these errors.

What we need is a statistical analysis that fulfills our desire to quantitatively understand the weaknesses in our models. One simple practice which meets this need and can help the indecisive data practitioner to direct and allocate their limited time is Error Analysis..."
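
As a concrete illustration of the kind of quantitative error analysis the article advocates, here is a minimal Python sketch that scores residuals and flags outlier errors. The synthetic data and the 3-sigma threshold are assumptions for illustration, not the article's own method.

```python
# Minimal sketch of quantitative error analysis: standardize the
# residuals, flag outlier errors, and list the worst offenders.
# Synthetic data; the 3-sigma cutoff is a common rule of thumb.
import numpy as np

rng = np.random.default_rng(0)
y_true = rng.normal(50, 10, 500)
y_pred = y_true + rng.normal(0, 3, 500)
y_pred[::50] += 25                      # inject a few gross errors

residuals = y_true - y_pred
z = (residuals - residuals.mean()) / residuals.std()

outliers = np.abs(z) > 3                # flag extreme errors
print(f"{outliers.sum()} of {len(z)} points are outlier errors")
print("worst indices:", np.argsort(-np.abs(z))[:5])
```

From here, the flagged indices can be joined back to the input features to ask the qualitative question the article raises: do the worst errors cluster in a segment of the data where the model's assumptions break down?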
