human activity in more efficient ways and automating dull repetitive work. But AI also has some genuinely new applications that could transform how drug discovery is done, thanks to the way it approaches data. For example, machine learning may discover patterns that humans would not, and come up with solutions no sane human would, but which nevertheless work better than anything humans have discovered.

We only need to look at other industries to see the potential. NASA used generative algorithms to design an antenna which met a set of criteria – the result would never have occurred to a human, but it was better aligned to their needs than anything a human came up with. Airbus is using similar approaches to design improved aircraft parts (non-safety-critical ones for the moment). Electronic circuit design algorithms outperform human designers.

We may soon see this kind of creativity applied to drug design routinely – setting requirements and letting AI loose on data to try to find solutions that meet them. Doing so may take a change of mindset, and may involve going through a few crazy ideas and refining the approach many times before finding useful ones.

This is a particular advantage of machine learning. Because the algorithm itself does not have human biases, it will search for the best way to meet criteria, and will search in places humans would not. Some of these will be absurd, and humans will need to refine the models to redirect thinking. But – if other industries are anything to go by – some will hit on genuinely new and innovative solutions. While humans are put off by approaches outside their conventional thinking, AI can open up new routes to the desired result not previously considered.


AI in practice: applying language translation AIs to creating new pharmaceuticals

To illustrate how AI can deliver benefits in drug design, let us take a real example, which Tessella was involved in and which was recently published in the Journal of Chemical Information and Modeling. A common drug design approach is to use a higher-level description of what we want a molecule to look like. One such description is the reduced graph, which involves specifying in chemistry terminology what structure the molecule should have: for example, "an aromatic ring connected to a linker, which in turn is connected to an aliphatic ring acceptor, which in turn will potentially be connected to several other molecular substructures with different characterisations".
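To make that idea concrete, the description quoted above can be pictured as a small labelled graph. The sketch below is purely illustrative: the node labels and the plain-Python encoding are hypothetical, not the representation used in the published work.

```python
# Illustrative only: a reduced graph as labelled nodes plus connections.
# Node labels are hypothetical stand-ins for pharmacophore-style features.
reduced_graph = {
    "nodes": {
        "n1": "aromatic_ring",
        "n2": "linker",
        "n3": "aliphatic_ring_acceptor",
    },
    "edges": [
        ("n1", "n2"),   # aromatic ring connected to a linker
        ("n2", "n3"),   # linker connected to an aliphatic ring acceptor
        # further nodes and edges would describe the other substructures
    ],
}

# Many different concrete molecules can satisfy this one outline,
# which is what makes it useful as a search specification.
for a, b in reduced_graph["edges"]:
    print(f'{reduced_graph["nodes"][a]} -- {reduced_graph["nodes"][b]}')
```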


This high-level description is useful because it limits the search for molecules to those which meet specified criteria, ie having a similar structure to a known active compound. Creating a reduced graph for a known molecule is easy; the bigger challenge is the opposite process – finding good potential molecules which match the desired reduced graph.

It is a bit like buying a house: if your criterion is ‘any house’, you will never find what you are looking for. But if you specify location, number of bedrooms and price, you have a better chance. Specifying the reduced graph of a molecule is like providing a detailed layout for your ideal home. However, while there are a million or so property ads online, the number of molecules available for drug design is around 10⁶⁰, with the overwhelming majority never having been synthesised in a laboratory.


This challenge of generating a set of candidate molecules from a reduced graph description is something AI can help with. Remarkably, we found that this problem can be related to a completely separate AI challenge: translating languages.
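As a sketch of that framing (with entirely invented tokens on both sides, not the vocabulary used in the published work), a training example for such a ‘translation’ model would pair a reduced-graph outline with one molecule that satisfies it:

```python
# Hypothetical illustration of the translation framing: the source is a
# tokenised reduced-graph outline, the target is a SMILES string for one
# molecule matching it. Token names here are invented for the example.
training_pair = {
    "source": ["<Ar>", "-", "<Linker>", "-", "<AliphaticRingAcceptor>"],
    "target": list("c1ccccc1CCN1CCOCC1"),   # phenyl - ethyl linker - morpholine ring
}

# One outline can map to many valid molecules, just as one sentence can
# have several acceptable translations.
print(" ".join(training_pair["source"]))
print("".join(training_pair["target"]))
```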


Language translation takes advantage of two cutting-edge developments in neural networks: ‘sequence-to-sequence learning’ and ‘attention mechanisms’.


Sequence-to-sequence learning takes a sequence of words – a sentence in English – and outputs another sequence of words – a translation in French. Languages have very different structures, which is why successful machine learning approaches consider sentences in their entirety and generate a new sentence which captures the whole meaning.
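A minimal sketch of that idea in PyTorch, for illustration only and not the architecture from the published work: the encoder reads the whole input sequence, and the decoder generates the whole output sequence conditioned on the encoded context.

```python
import torch
import torch.nn as nn

class TinySeq2Seq(nn.Module):
    """Minimal encoder-decoder: reads a whole input sequence, then emits a
    whole output sequence conditioned on the encoded context."""

    def __init__(self, src_vocab: int, tgt_vocab: int, hidden: int = 64):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, hidden)
        self.tgt_emb = nn.Embedding(tgt_vocab, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src_ids: torch.Tensor, tgt_ids: torch.Tensor) -> torch.Tensor:
        # Encode the entire source sequence into a single context state.
        _, context = self.encoder(self.src_emb(src_ids))
        # Decode the target sequence conditioned on that context.
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), context)
        return self.out(dec_out)            # logits over the target vocabulary

model = TinySeq2Seq(src_vocab=30, tgt_vocab=40)
src = torch.randint(0, 30, (2, 12))         # batch of 2 source sequences
tgt = torch.randint(0, 40, (2, 15))         # batch of 2 target sequences
print(model(src, tgt).shape)                # torch.Size([2, 15, 40])
```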


It is, of course, also useful to know that particular words in each language relate to each other, and this is where the attention mechanism comes in. Attention mechanisms allow the model to focus on particular words in the input sentence when generating particular words in the output.
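Under the hood, that ‘focus’ is just a normalised relevance score between each output position and each input position. A bare-bones version of the calculation (scaled dot-product attention, shown here with random numbers rather than a trained model) looks like this:

```python
import numpy as np

def attention(queries, keys, values):
    """Scaled dot-product attention: each output row is a weighted mix of
    `values`, and the weights say which input positions were focused on."""
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)            # relevance of every input to every output
    scores -= scores.max(axis=-1, keepdims=True)      # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over input positions
    return weights @ values, weights

rng = np.random.default_rng(0)
out, w = attention(rng.normal(size=(5, 16)),          # 5 output (decoder) positions
                   rng.normal(size=(7, 16)),          # 7 input (encoder) positions
                   rng.normal(size=(7, 16)))
print(w.shape)        # (5, 7): one attention weight per output/input pair
print(w.sum(axis=1))  # each row sums to 1
```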


Together, this allows translations in which the right words are selected but which also capture the correct overall meaning.

A molecule can be represented as a text sequence using a SMILES string. The same is true of the high-level reduced graph capturing the outline of what the molecule should look like. We created an approach which applied the same basic principles of language translation to ‘translate’ the outline of a molecule into a specified novel molecule matching the outline. In other words, to predict a molecule to match our requirements.
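To illustrate the molecule-as-text idea, a SMILES string can be split into tokens much like words in a sentence, and candidate outputs can be checked for chemical validity. The crude tokeniser and the use of RDKit below are an illustrative sketch under those assumptions, not the pipeline from the paper.

```python
import re
from rdkit import Chem   # open-source cheminformatics toolkit

# Crude SMILES tokeniser: bracket atoms and two-letter elements first,
# otherwise one character per token. Real pipelines use richer rules.
SMILES_TOKEN = re.compile(r"\[[^\]]+\]|Cl|Br|.")

def tokenize(smiles: str) -> list[str]:
    return SMILES_TOKEN.findall(smiles)

def is_valid(smiles: str) -> bool:
    # RDKit returns None if the string cannot be parsed as a molecule.
    return Chem.MolFromSmiles(smiles) is not None

print(tokenize("CC(=O)Oc1ccccc1C(=O)O"))   # aspirin, split into tokens
print(is_valid("CC(=O)Oc1ccccc1C(=O)O"))   # True
print(is_valid("C1CC"))                    # False: unclosed ring
```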

