American Sign Language (ASL) is the primary language of the deaf community in the United States and Canada. It uses hand gestures, alone or in sequence, to communicate information. An estimated 500,000 people use ASL as their primary language [1]. To expand access to the deaf community, it is necessary to develop tools that facilitate communication between the deaf and hearing communities. For this project, I explored the possibility of using a neural network to recognize hand signs in images.

Findings

This project was conducted using a Kaggle dataset containing approximately 35,000 images of 24 hand signs, each representing a different letter of the English alphabet. The letters "J" and "Z" were excluded because their signs require motion and cannot be captured in a single image. The image below shows the hand sign for the letter "C". A Convolutional Neural Network (CNN) was created to extract featu...
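To illustrate the feature-extraction step that a CNN performs, here is a minimal NumPy sketch of a single convolution: a small kernel slides over an image and produces a feature map that responds strongly where a pattern (here, a vertical edge) appears. This is a simplified, hypothetical illustration, not the project's actual model; the toy image, kernel, and `conv2d` helper are assumptions for demonstration only.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation: slide the kernel over the image
    and record the sum of elementwise products at each position."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy 6x6 "image": bright left half, dark right half.
image = np.zeros((6, 6))
image[:, :3] = 1.0

# Vertical-edge kernel: responds where brightness changes left to right.
kernel = np.array([[1.0, -1.0],
                   [1.0, -1.0]])

fmap = conv2d(image, kernel)
print(fmap.shape)   # (5, 5)
print(fmap[0])      # strongest response at the brightness boundary
```

In a real CNN, many such kernels are learned from the training data rather than hand-specified, and several convolution layers are stacked so that later layers detect increasingly complex shapes, such as the curl of fingers forming the letter "C".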