This week, we talk about Google: one of the world’s largest IT companies is frequently in the news, and not always for the right reasons.
We find out what happens behind the scenes from our guest Roland Szabo, a software engineer who started his career developing machine learning services at Google before striking out on his own as an ML consultant.
We start by taking a deep dive into Google’s hiring process, and find out some of the benefits and drawbacks of working for a large corporation. We discuss working hours, cross-team collaboration, and of course, the office environment. We’re talking themed micro-kitchens here.
Next, we debut a brand new segment called ‘In Fairness to Google’, in which we look at whatever new controversy is brewing at the company and jump to its defence – or at least, we try. This week, we’re dealing with Google’s position on unions and the dismissal of AI ethics researcher Timnit Gebru, and we ask whether machine learning models can be too big.
And finally, we talk about clouds, chips, and the fragmentation of AI infrastructure stacks. Every major cloud vendor seems set on doing machine learning its own way, and startups keep developing new and exciting – but rarely compatible – hardware for AI. Where does it end, and how do you decide what to run on your own servers?
Also in this episode: Reasons to like Google+! Unexpected benefits of GitHub! The horrors of Terms of Service!
As always, you can find the people responsible for this circus of a podcast online: