Why Distributed Edge Data is the Future of AI (Oct. 3rd)
IT - Operations

JumpCloud Inc.'s newest small and medium-sized enterprise (SME) IT Trends Report, 'Flexibility and Ingenuity: What's Powering Small and Medium-Sized Enterprise IT Management in 2023', shows that just after SMEs successfully established the new workplace normal following the pandemic, significant turbulence in the greater macroeconomic environment has threatened to upend the system again.

Instead of lockdowns and supply chain shortages, businesses now deal with layoffs and recession fears on top of external threats growing in sophistication, regulatory and compliance pressures heating up, and increasingly complex IT tool sprawl.

JumpCloud commissioned this biannual survey of SME IT admins to gain unique insights into the day-to-day experiences of IT professionals who power and secure operations without enterprise-level budgets and staff. The most recent survey, which polled admins in the US, UK, and France, highlights that while IT teams are successfully managing the workplace, they need an IT environment built around an open directory platform.

Generative artificial intelligence models such as ChatGPT are known to consume vast amounts of energy, but few people are aware of the enormous amounts of water required to keep them up and running too.

In their latest environmental reports, Microsoft Corp. and Google LLC - two of the leading players in the generative AI industry - reported a massive spike in the water consumption of their data centers, which experts attribute to the growing popularity of AI models.

An article in the Associated Press published Saturday reported that Microsoft's data center water use increased by 34% from 2021 to 2022. The company slurped up more than 1.7 billion gallons, or 6.4 billion liters, of water last year, which is said to be enough to fill more than 2,500 Olympic-sized swimming pools.
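The reported figures are easy to sanity-check. A minimal sketch, assuming the standard conversion of roughly 3.785 liters per US gallon and an Olympic-sized pool capacity of about 2.5 million liters (both assumptions, not figures from the article):

```python
# Sanity check of the reported water-use figures.
# Assumed conversion factors: 1 US gallon ≈ 3.785 L;
# one Olympic-sized pool ≈ 2.5 million L.
GALLONS = 1.7e9              # Microsoft's reported 2022 usage
LITERS_PER_GALLON = 3.785
POOL_LITERS = 2.5e6

liters = GALLONS * LITERS_PER_GALLON
pools = liters / POOL_LITERS

print(f"{liters / 1e9:.1f} billion liters")  # ≈ 6.4 billion liters
print(f"{pools:.0f} Olympic pools")          # ≈ 2574 pools, i.e. "more than 2,500"
```

The result lines up with the article: about 6.4 billion liters, comfortably more than 2,500 pools' worth.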

It was a similar story with Google, which reported a 20% spike in its water consumption over the same timeframe.

What's Next For Observability?
InfoWorld, Tuesday, September 12, 2023
Today's systems are exposing more of their underlying complexity to operators. These are the most exciting new developments along the journey of taming that complexity.

The concept of observability traces back to the 1960s, with Rudolf E. Kalman's canonical work around decomposing complex systems for human understanding. It was a heady time for new compute systems in aerospace and navigation. The advances in these systems exceeded humans' ability to reason about them, and Kalman's work is largely credited for laying the foundation for observability theory.
