The Big Health Data–Intelligent Machine Paradox

Presentation transcript:

The Big Health Data–Intelligent Machine Paradox. D. Douglas Miller, MD, CM, MBA. The American Journal of Medicine, Volume 131, Issue 11, Pages 1272-1275 (November 2018). DOI: 10.1016/j.amjmed.2018.05.038. Copyright © 2018.

Figure 1 (A) The recurrent neural network (RNN) architecture makes use of sequential information. RNNs are called recurrent because they perform the same task for each element of a sequence, with each output depending on the prior computations. This creates a short-term 'memory' that captures information about those earlier calculations. Here a simple RNN is unrolled into a 3-layer neural network designed to decode a 3-word phrase; the input at time step t is a vector representation of word 2 in the phrase. The defining feature of an RNN is the so-called hidden state, which links the memories at each time step (the blue arrows from and to the gray boxes). This memory is a mathematical function of the previous hidden state at time t − 1 and the current input at time t. The final output is a vector of probabilities for word 3 of the phrase, drawn from the vocabulary of choices available at time t + 1.

(B) For an RNN to predict the next word in a sentence (ie, language modeling), it helps to know which words came before it. In these 2 sentences, a multilayer neural network sequentially predicts the next word from the unrolled RNN's hidden-state memory of the prior layers' outputs and the current input (ie, "I have a pen. I have an ???"). Performing the same task at each step of the sequence with different inputs yields a vector of mathematical probabilities (ie, a generative model) indicating that the final word of the second sentence is apple and not pen, red, or hello. High-probability sentences are typically correct (ie, "I have an apple"). This explains (in part) how RNNs (and more sophisticated long short-term memory units) can successfully carry out natural language processing tasks such as reading a medical record. The American Journal of Medicine 2018;131:1272-1275. DOI: 10.1016/j.amjmed.2018.05.038. Copyright © 2018.
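To make the recurrence described in Figure 1 concrete, the sketch below implements a toy unrolled RNN step in plain NumPy: the hidden state at each time step is computed from the previous hidden state and the current word vector, and the final hidden state is mapped to a probability distribution over the vocabulary for the next word. The eight-word vocabulary, the layer sizes, and the randomly initialized weight matrices (W_xh, W_hh, W_hy) are illustrative assumptions, not the article's model.

```python
# Minimal sketch of the unrolled RNN in Figure 1 (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)

vocab = ["i", "have", "a", "an", "pen", "apple", "red", "hello"]
vocab_size = len(vocab)
hidden_size = 16

# Randomly initialized parameters; in practice these are learned by
# back-propagation through time.
W_xh = rng.normal(scale=0.1, size=(hidden_size, vocab_size))   # input  -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden
W_hy = rng.normal(scale=0.1, size=(vocab_size, hidden_size))   # hidden -> output

def one_hot(word):
    v = np.zeros(vocab_size)
    v[vocab.index(word)] = 1.0
    return v

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def rnn_forward(words):
    """Unroll the RNN over a word sequence and return next-word probabilities."""
    h = np.zeros(hidden_size)  # hidden state: the short-term "memory"
    for w in words:
        x = one_hot(w)
        # Same computation at every time step: new memory from old memory + current input.
        h = np.tanh(W_xh @ x + W_hh @ h)
    return softmax(W_hy @ h)  # probability of each vocabulary word coming next

probs = rnn_forward(["i", "have", "an"])
for word, p in sorted(zip(vocab, probs), key=lambda t: -t[1]):
    print(f"{word:>6s}: {p:.3f}")
```

With these untrained random weights the printed distribution is close to uniform; training by back-propagation through time would adjust the three weight matrices so that most of the probability mass falls on "apple" after "I have an". Long short-term memory units extend this same scheme with gating so the memory can persist across longer sequences.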

Figure 2 Researchers registered with CLEF can access open-source captioned image databases (ie, plant specimen photos, digital medical images) and submit requests to use large training data sets for artificial intelligence analytics.10 The upper run illustrates training a sentence-generating model on an ImageCLEF dev set. The lower run first trains the model on the Microsoft MS COCO image data set and then tests it on the ImageCLEF dev set. An algorithmic scoring system (ie, METEOR) assesses the performance of different image-captioning software for concept detection (a higher METEOR score indicates better concept detection). Concept-based sentence re-ranking can then be applied to the sentences generated by these LSTM-RNN models. The output sentence, "A plant with pink flower and brown stem…," reflects the transformed hidden-state description of the original image generated by the neural network system. CLEF = Cross Language Evaluation Forum; CNN = convolutional neural network; LSTM = long short-term memory unit; METEOR = Metric for Evaluation of Translation with Explicit Ordering; MS COCO = Microsoft Common Objects in Context; RNN = recurrent neural network. The American Journal of Medicine 2018;131:1272-1275. DOI: 10.1016/j.amjmed.2018.05.038. Copyright © 2018.
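The captioning pipeline in Figure 2 pairs a convolutional encoder with an LSTM decoder. The sketch below, in the spirit of Show-and-Tell-style captioners, is a minimal illustrative PyTorch version rather than the ImageCLEF system described in the article: the backbone choice (ResNet-18 with random weights), the embedding and hidden sizes, and the toy vocabulary size are all assumptions made so the example is self-contained and runnable.

```python
# Minimal sketch of a CNN-encoder / LSTM-decoder image captioner
# (illustrative assumptions only; not the ImageCLEF pipeline in Figure 2).
import torch
import torch.nn as nn
import torchvision.models as models

class CaptionModel(nn.Module):
    def __init__(self, vocab_size, embed_dim=256, hidden_dim=512):
        super().__init__()
        # CNN encoder: an off-the-shelf backbone with its classifier head removed.
        # Random weights are used here so the sketch needs no download.
        backbone = models.resnet18()
        self.encoder = nn.Sequential(*list(backbone.children())[:-1])
        self.img_proj = nn.Linear(512, embed_dim)
        # LSTM decoder: predicts the caption one word at a time.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, images, captions):
        feats = self.encoder(images).flatten(1)      # (B, 512) image features
        feats = self.img_proj(feats).unsqueeze(1)    # (B, 1, embed_dim)
        words = self.embed(captions)                 # (B, T, embed_dim)
        # Feed the image feature as the first "word" of the sequence,
        # then let the LSTM's hidden state carry the description forward.
        seq = torch.cat([feats, words], dim=1)
        hidden, _ = self.lstm(seq)
        return self.out(hidden)                      # (B, T+1, vocab_size) word scores

# Toy usage: one fake 224x224 image and a 5-token caption from a 1000-word vocabulary.
model = CaptionModel(vocab_size=1000)
scores = model(torch.randn(1, 3, 224, 224), torch.randint(0, 1000, (1, 5)))
print(scores.shape)  # torch.Size([1, 6, 1000])
```

In the workflow the figure describes, a model of this general shape would first be trained on MS COCO image-caption pairs, then tested on the ImageCLEF dev set, with METEOR scoring and concept-based sentence re-ranking applied to the captions it generates.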