- For an endurance sport like cricket, moments of action are often sudden. Less devoted fans may find themselves putting it on as background noise. But AI is here to change that.
- Monty, an AI prediction system, was built to predict certain plays and alert fans when they may be about to happen. In the future, fans may be able to customize which plays they get alerted about.
- Notably, Monty uses Google technology — from the cloud, to its advertising systems — to help Mindshare, its creators, generate ad campaigns that drive people to watch a match when it detects that something exciting is about to happen.
- It could be a sign of automatically-generated, personalized ads yet to come — but the challenge will be to keep it from getting creepy.
Would cricket be more interesting if you could tune in just before the most exciting moments? Asking an AI to predict when a wicket will fall surfaces the highlights of the match – even when the wicket doesn’t fall. AI ads could also find you the perfect socks or persuade you to try something you might hate – if that doesn’t get too creepy.
For the most passionate cricket fans, the rhythm of a game – which can go on for eight hours a day for five days in a test match – is part of the attraction: long periods of solid play that test the endurance and temperament of the players as well as their technique, interspersed with sudden moments of dramatic action, when it might take an action replay to see the details of what happened. But when you’ve paid $600 million for the broadcast rights, you want more than those passionate fans who are prepared to invest time and money to get the most out of the matches – even in as cricket-mad a country as Australia, where a match will be playing in every bar.
“If you want to watch every ball, you have to pay $60; that’s like asking an Australian to pay to go to the beach,” Jack Smyth, head of innovation at WPP-owned media consultancy Mindshare, told Business Insider. “Most people have the cricket on as background TV and they drop in and out. Fair-weather fans aren’t going to spend five days in the lounge watching the match, but if we can give them a good reason to duck into the pub or jump on the live stream, there’s an opportunity to get them to buy in.”
To pull them in at just the right moment, Mindshare built an AI prediction system for Fox Sports called Monty, using Google’s AutoML Tables machine learning service, to predict when a wicket is going to be taken up to five minutes in advance, and warn fans to switch on the cricket with a near-real-time “Wicket Warning.”
Smyth’s team trained the system on the data from every game played by the Australian men’s cricket team for a year. For each ball bowled, data provider Opta Sports tracks 83 variables and makes those available within seconds; five of those are just about what kind of ball it is – fast or slow, spin or offside. Then there’s other data: how long the bowler has been in, how long the batsman has been at the crease, whether they’re the first or last batsman of the day, what position the fielders are in, even what the weather is like. Where the fielders are tells you a lot about the tactics the captain of the cricket team is using, but not all the data turned out to be useful.
“Lots of fans think the type of pitch is significant, or the weather and the humidity; but Monty didn’t pay attention to the environment, just the mechanics,” Smyth explains.
Predicting individual balls didn’t leave enough time to tell the fans what was about to happen, so instead the team built a classification model giving a confidence value for whether the wicket would be taken and how – would the batsman be LBW, run out or have the ball caught? An early version of the model kept predicting that batsmen would be run out, over and over again.
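The article doesn’t publish Monty’s actual schema, but the output of a model like the one described – a probability per dismissal type, collapsed into an overall wicket confidence – can be sketched roughly like this. The class names and numbers below are illustrative assumptions, not Mindshare’s real data:

```python
# Hypothetical sketch of a Monty-style prediction summary.
# The classifier emits one probability per dismissal type; the
# alerting layer cares about the total wicket probability and
# which dismissal is most likely.

WICKET_CLASSES = ["caught", "bowled", "lbw", "run_out", "no_wicket"]

def summarize_prediction(probs: dict) -> tuple:
    """Return (P(any wicket), most likely dismissal type)."""
    wicket_prob = sum(p for c, p in probs.items() if c != "no_wicket")
    top_class = max((c for c in probs if c != "no_wicket"), key=probs.get)
    return wicket_prob, top_class

# Example ball: the model leans heavily toward a catch.
probs = {"caught": 0.45, "bowled": 0.10, "lbw": 0.20,
         "run_out": 0.05, "no_wicket": 0.20}
wicket_prob, how = summarize_prediction(probs)
print(f"P(wicket)={wicket_prob:.2f}, most likely: {how}")
```

Summing over the dismissal classes rather than predicting a single label is what lets a system like this report “supreme confidence” that *some* wicket will fall even when no individual dismissal type dominates.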
“That was a bit of a ghost in the machine moment,” Smyth jokes; “the likelihood is remarkably low, but Monty had an unnerving certainty it would happen.”
Smyth says AutoML Tables was “remarkably easy to use”: by just tagging the data, the team was able to narrow down the variables that Monty could use to make increasingly accurate predictions. The five-minute window was a compromise between predictions being accurate enough to rely on and arriving soon enough for the team to act on them – sending push notifications, flashing up alerts on digital billboards around the country, and deciding how much money to bid for online ad impressions through Google Ads.
That’s part of the reason Mindshare chose the Google platform, Smyth says — having the cloud in the same place as the system for buying online ad campaigns made life much easier.
“We chose AutoML because we could build it fairly quickly,” Smyth says.
When Monty was “supremely confident,” at the 80% threshold or above, the team sent a push notification directly to fans who had the Fox cricket app. Below that threshold, the prediction would be used to create five-second ads with a wicket warning, or for display media like the digital billboards.
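The routing rule described here is simple enough to sketch. The 80% cutoff comes from the article; the function and channel names are invented for illustration:

```python
# Illustrative sketch of the confidence-based routing described above:
# high-confidence predictions trigger a direct push notification to
# Fox cricket app users; lower-confidence ones instead feed the
# "wicket warning" ad placements (five-second spots, billboards).

PUSH_THRESHOLD = 0.80  # "supremely confident" cutoff per the article

def route_alert(confidence: float) -> str:
    if confidence >= PUSH_THRESHOLD:
        return "push_notification"
    return "wicket_warning_ad"

print(route_alert(0.85))  # push_notification
print(route_alert(0.60))  # wicket_warning_ad
```

Splitting channels by confidence like this limits the cost of a wrong prediction: a missed billboard ad is cheap, while a false push notification to a subscriber erodes trust in the alerts.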
“We were making ads on the fly based on what Monty thought would happen next,” says Smyth. But it wasn’t just about getting people to tune in. “It’s an extension of the broadcast, allowing you to plan the rest of your day.” That might go as far as telling you if it’s safe to take a bathroom break.
Hitting a six
Fox is understandably cautious about saying quite how accurate Monty is, because it expects other broadcasters to be creating their own AI predictions. The channel is happier talking about how Monty delivered a 150% increase in subscriptions for the same amount of money spent, and how twice as many people remembered Fox for cricket over the competition. In tests, Monty was up to 91% accurate, but in practice, for test matches, Monty predicted which wickets would fall with 87.2% accuracy.
“For our very first notification, the wicket fell nine minutes after the prediction, not five – but that created an immediate rush of excitement in the product team and among cricket fans worldwide. The experience was so magical that the reaction was less about waiting five minutes for a wicket to fall but whether Monty would be right at all.”
Anyone using Google Assistant, the voice assistant, could ask for a prediction from Monty at any time, and ask how the model had made that prediction.
“We’d see people who were curious about what was going to happen next. It was like having your own personal commentator,” Smyth suggests; “you’re in control of what you’re watching, and you have a new understanding of the game.”
“In the next iteration, fans will be able to customize alerts; not ‘will the wicket fall’ but ‘if it falls will it be worth watching’.”
A future version of Monty could go even further and give fans their own custom highlights. “You could say ‘I’m more interested in sixes being scored than in wickets’ and we could alert you when certain plays happen. For purists, you could say you don’t want alerts but you want to see if you can beat Monty, based on your understanding of the game and tactics.”
Machine learning models need to be retrained as situations change and that includes how cricketers develop their game. “As players wax and wane, it’s interesting for fans to talk about ‘if he was his normal self, he wouldn’t have taken that errant swing’; you want to know why he was off his game, does this bowler have his number?”
There could also be a version of Monty for fantasy cricket teams, which is an increasingly lucrative area. “You could ask about mythical matches where modern players could come together with the cricketing greats. Or Monty could be the ultimate manager for your fantasy team. You humanly can’t watch every single player so Monty could tap you on the shoulder and say ‘these are three players that caught my eye’.”
AI is customizing advertising at scale
Behind the scenes, AI is already powering advertising at scale, under the name “dynamic creative optimization,” explains Karsten Weide, Digital Media and Entertainment Program Vice President at IDC. This technology automatically generates and optimizes the design of display ads to maximize effectiveness — and there’s research on applying it to video.
“Imagine you’re Brand X and you want to address 500 million customers with a campaign, across 45 countries, across PCs, laptops, smart phones, tablets, smart TVs, set-top boxes, across all parts of the funnel with the need to engage in storytelling, accompanying the ‘customer journey,’” Weide says. “And every single customer needs to receive the exact right message at the exact right moment on the appropriate device.”
The volume of data those dynamic ads are based on is increasing – tripling by 2025 with up to a quarter being live data streams, says Weide.
“There’s no way to do this without automation, and no way to do this optimally without AI and ML,” Weide maintains.
So far, fully AI-powered advertising directed at the public has been relatively rare, and it’s more likely to be interactive than video, suggests Steve Guggenheimer, corporate vice president of Microsoft’s AI Business.
“I think there’s a real opportunity to use AI to both be creative and add individuality to advertising with virtual, real-time engagement,” he says. “The more creative or personal you can make things, the more engaging they are and there’s an opportunity to be both more creative and more individual at scale – which is a hard thing to do.”
AR apps let you see IKEA furniture in your own room, Dulux paint colors on your own walls, or L’Oreal and Sephora makeup on your own face. Digital creative agency AnalogFolk used image recognition to create an “Eat Your Feed” ad for Knorr suggesting personalized recipes based on photos in your Instagram feed: skiing photos get you a hearty lamb hotpot to warm you up, snowboarding action shots match a “cardio boosting One Pot Mushroom Ragout with Fusilli and Spinach.”
More sophisticated is the TasteFace app the AnalogFolk team built for Marmite, the sticky, savory spread that glories in the reputation that you either love it or hate it. TasteFace used the Microsoft Emotion API to analyze the reactions of people tasting free samples of Marmite, converting expressions of anger, contempt, disgust, fear, happiness, sadness, surprise or no reaction into a score for love or hate, and a personalized reaction animation it could share.
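The way TasteFace actually weighted those emotions isn’t public, but the general shape of the mapping – per-emotion scores from a recognition API collapsed into a single love/hate score – can be sketched like this. The emotion groupings and weighting are assumptions for illustration:

```python
# Hedged sketch of a TasteFace-style love/hate score. Input is a
# dict of per-emotion confidences, as an emotion-recognition API
# might return; output is a single score in [-1, 1].
# The grouping of emotions into "love" and "hate" is invented here.

LOVE_EMOTIONS = {"happiness", "surprise"}
HATE_EMOTIONS = {"anger", "contempt", "disgust", "fear", "sadness"}

def love_hate_score(emotions: dict) -> float:
    """Return +1 for pure love, -1 for pure hate, 0 for no reaction."""
    love = sum(emotions.get(e, 0.0) for e in LOVE_EMOTIONS)
    hate = sum(emotions.get(e, 0.0) for e in HATE_EMOTIONS)
    total = love + hate
    return 0.0 if total == 0 else (love - hate) / total

# A taster who mostly smiles but winces a little leans toward love.
print(love_hate_score({"happiness": 0.7, "disgust": 0.3}))
```

Normalizing by the total expressed emotion means a mild smile and an ecstatic grin both read as “love”; only the balance between positive and negative expressions moves the score.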
That kind of AI-powered ad could work with lots of kinds of interactions, Guggenheimer says.
“The hook is to interactively understand what people say or the gestures they’re making or the expression on their face.” But it needs to be done cautiously and without so much tracking that it becomes intrusive.
He compares a digital agent that tells you which aisle the printer cartridges are in or compliments your shoes and suggests a cool pair of socks to go with them, with coupons targeting you based on past purchases. Shoppers don’t want an experience like Target telling a girl’s father she’s pregnant or Tom Cruise in Minority Report, trying to hide in a smart shop that announces his name and the shirts he last bought. “Doing something that captures your attention as an individual is great; stalking me is not.”