KINDRED GROUP


MICHAEL FRANKLIN


The product is currently in prototype stage and not yet available to customers. Initially created as a Google Action, the underlying Teneo technology is fully agnostic, meaning it can plug in to Alexa, Siri, Cortana and whichever smart assistant comes next. Michael Franklin: “We wanted to explore whether smart speakers could facilitate voice-driven betting. There are natural language challenges and limitations within the still nascent voice space, but together with Artificial Solutions we used powerful natural language technology to create a prototype to test whether it was possible to overcome these problems and deliver a great customer experience – allowing someone to shout out “Hey Google, stick twenty quid on Spurs to win two nil” and have their bet placed.”


The team focused on the two primary smart speakers, Google Home and Amazon Alexa. Each has its own in-built Automatic Speech Recognition (ASR), which directly impacts the capability of a Skill or Action: what phrases and terms the device understands, and how context affects this. The team needed to feed in large data sets to gauge each ASR – not only what it understood, but importantly what it got wrong. This resulted in over 4000 betting phrases being said to both Alexa and Google to build a database.

Michael explains: “We quickly found that of the two devices Amazon Alexa performed poorly in a betting context, with only 10% of phrases heard correctly. It also picked up 1084 team variations from the 30 team names and nicknames we used; for example, ‘Watford’ was heard as ‘Waterford’, ‘What read’ and ‘Wackford’, along with an additional 43 variants.

“Google Home’s system performed noticeably better in comparison, due to its history as a search engine and a broader ASR which could understand more sports and betting language. Despite this, it picked up only 43% of the phrases, hearing “Place a tenner” as “Play Santana” or “Place a bet on” as “Play Sabaton”. Importantly, however, when it was wrong it was consistently wrong, meaning we had confidence in what the input was supposed to be and could cover for mistakes within the natural language engine.

“Neither Google Home nor Amazon Alexa have been purposefully developed to understand betting phrases or facilitate betting functionalities, so we had to build a broader language ontology within the NLP engine to cover for the mistakes and deliver a working prototype.”
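The article doesn’t show what that ontology looks like inside Teneo, but the idea of mapping consistent mishearings back to the intended team can be illustrated with a minimal, hypothetical Python sketch; the variant table, function name and matching threshold below are illustrative assumptions, not Kindred’s implementation.

```python
import difflib

# Hypothetical variant table, seeded from phrases the ASR consistently misheard
# (e.g. 'Watford' coming back as 'Waterford', 'What read' or 'Wackford').
TEAM_VARIANTS = {
    "watford": {"waterford", "what read", "wackford"},
    "tottenham hotspur": {"spurs", "tottenham"},
    "arsenal": {"gunners", "the gunners"},
}

def canonical_team(asr_text, cutoff=0.8):
    """Map a (possibly misheard) ASR transcript of a team name to its canonical form."""
    text = asr_text.lower().strip()
    # Exact hit on the canonical name or a known misheard variant.
    for team, variants in TEAM_VARIANTS.items():
        if text == team or text in variants:
            return team
    # Otherwise fall back to fuzzy matching against every known spelling.
    spelling_to_team = {v: t for t, vs in TEAM_VARIANTS.items() for v in vs | {t}}
    match = difflib.get_close_matches(text, list(spelling_to_team), n=1, cutoff=cutoff)
    return spelling_to_team[match[0]] if match else None

print(canonical_team("Waterford"))  # -> 'watford'
print(canonical_team("Wackford"))   # -> 'watford'
```

Because the mishearings were consistent, a lookup of this kind only has to be built once per device and can then absorb most ASR errors before the phrase reaches the rest of the dialogue logic.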


NATURAL LANGUAGE COMPLICATIONS

Following the work previously done developing the Unibet Facebook Chatbot, the team’s Natural Language engine already understood the nuanced lexicon and the variation in the ways users could request bets – from team names to scores or final outcomes. Phrases such as “Stick a tenner on the Gunners” or “what are the returns on BTTS” don’t easily translate to other areas of life and are specific to gambling. Combining this nuance and complexity with an ASR that didn’t correctly understand betting phrases raised the ambiguity of inputs even further.
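To make that lexicon concrete, here is a small, hypothetical Python sketch of slot extraction for a spoken betting phrase; the slang tables and function below are illustrative only, and nothing like the breadth of the actual Teneo-based engine.

```python
import re

# Hypothetical slang tables; the production engine covers far more variation.
STAKE_SLANG = {"tenner": 10, "twenty quid": 20, "fiver": 5}
TEAM_SLANG = {"the gunners": "Arsenal", "spurs": "Tottenham Hotspur"}
MARKET_SLANG = {"btts": "both teams to score"}

def parse_bet(utterance):
    """Very rough intent and entity extraction for a spoken betting phrase."""
    text = utterance.lower()
    slots = {"intent": "place_bet" if re.search(r"\b(stick|place|put)\b", text) else "query_odds"}
    for slang, amount in STAKE_SLANG.items():
        if slang in text:
            slots["stake_gbp"] = amount
    for slang, team in TEAM_SLANG.items():
        if slang in text:
            slots["team"] = team
    for slang, market in MARKET_SLANG.items():
        if slang in text:
            slots["market"] = market
    return slots

print(parse_bet("Stick a tenner on the Gunners"))
# {'intent': 'place_bet', 'stake_gbp': 10, 'team': 'Arsenal'}
print(parse_bet("What are the returns on BTTS"))
# {'intent': 'query_odds', 'market': 'both teams to score'}
```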


“When speaking about sport and placing bets with a human, there is an assumption that those in the conversation have a certain level of understanding to discuss particular topics,” explains Michael, “as well as background knowledge, including assumptions about team popularity and the likelihood of winning a game. “City to win” or “Spurs to lose midweek” are obvious phrases which a human would understand. To overcome this potential minefield and make our device truly conversational, we had to build NLP that familiarised itself with and learnt from the knowledge of past bets, kick-off times and the availability of odds, so it could provide contextually relevant information.”
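A rough Python sketch of that kind of contextual disambiguation, under the assumption that fixture, odds and bet-history data are available from the sportsbook (all names and data below are hypothetical):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical fixture record; in practice this would come from the sportsbook's
# odds feed and the customer's betting history.
@dataclass
class Fixture:
    team: str
    kickoff: datetime
    odds_available: bool

def resolve_team(nickname, fixtures, past_bets):
    """Pick the most likely team behind an ambiguous nickname such as 'City'."""
    candidates = [f for f in fixtures
                  if nickname.lower() in f.team.lower() and f.odds_available]
    if not candidates:
        return None
    # Prefer a team the customer has backed before...
    for f in candidates:
        if f.team in past_bets:
            return f.team
    # ...otherwise the candidate kicking off soonest.
    return min(candidates, key=lambda f: f.kickoff).team

now = datetime.now()
fixtures = [
    Fixture("Manchester City", now + timedelta(days=1), True),
    Fixture("Leicester City", now + timedelta(days=3), True),
]
print(resolve_team("City", fixtures, past_bets=["Leicester City"]))  # -> 'Leicester City'
```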


Though still a prototype, through innovative experimentation the team has developed a Google Action that can overcome the ASR and natural language barriers to show that betting is possible through a voice interface. “Voice is certainly a hard interface to work with and there are still challenges to overcome – including important steps around identification and authentication of users – but by leveraging powerful NLP we’ve been able to prove that smart speakers could be the next interface for betting, positioning Kindred Group at the forefront of where the future may be heading.”







