36 skills found · Page 1 of 2
dxli94 / WLASL
WACV 2020 "Word-level Deep Sign Language Recognition from Video: A New Large-scale Dataset and Methods Comparison"
loicmarie / Sign Language Alphabet Recognizer
Simple sign language alphabet recognizer using Python, OpenCV and TensorFlow for training an Inception model (CNN classifier).
shreyasharma04 / HealthChatbot
🤖 Health Care Chat-Bot (Major Project 1, 4th year, 7th semester): a healthcare-domain chatbot that simulates the predictions of a general physician. A chatbot is software that can converse with people using artificial intelligence; such software is used to respond to users quickly, inform them, help purchase products, and provide better customer service. The three main areas where chatbots are used are diagnostics, patient engagement outside medical facilities, and mental health; this project focuses on diagnostics.
📃 Brief: A chatbot is an artificially intelligent program that can hold a text-based or spoken conversation with humans. This project uses Python, currently the most popular language for creating an AI chatbot; at the core of the chatbot architecture is the Natural Language Processing (NLP) layer. The aim is a user-friendly healthcare chatbot that facilitates the job of a healthcare provider and helps improve their performance by interacting with users in a human-like way, through a text or voice interface. Chatbots can be programmed to respond the same way each time, to respond differently to messages containing certain keywords, or to use machine learning to adapt their responses to fit the situation; they are used in e-commerce customer service, call centres, internet gaming, and similar applications. A growing number of hospitals, nursing homes, and private centres now use online chatbots on their websites. These bots engage potential patients visiting the site, helping them find specialists, book appointments, and access the correct treatment. Still, the use of artificial intelligence in an industry where people's lives may be at stake raises misgivings, and the question of whether such tasks ought instead to be assigned to human staff. This healthcare chatbot system helps hospitals provide online healthcare support 24x7; it answers specific as well as general questions, generates leads and automatically delivers lead information to sales, and guides patients toward exactly what they are looking for by asking questions in series.
📜 Problem Statement: During the pandemic it is more important than ever to get regular check-ups and continue taking prescription medications; the healthier you are, the more likely you are to recover quickly from an illness. At the same time, providers are deferring elective and preventive visits, such as annual physicals, and for some patients consulting online is not possible. In such cases, to avoid false information, this project can help.
📇 Features: Register screen. Sign-in screen. Generates a database for the user login system. Offers a GUI-based chatbot for patient diagnosis (a pragmatic approach to diagnosis). Recommends an appropriate doctor for the given symptom.
📜 Modules Used: The program uses a number of Python modules: tkinter, os, webbrowser, numpy, pandas, matplotlib.
📃 Algorithm: The chatbot uses a decision tree. A decision tree is a supervised learning technique that can be used for both classification and regression problems, though it is mostly preferred for classification. It is a tree-structured classifier in which internal nodes represent the features of a dataset, branches represent the decision rules, and each leaf node represents an outcome. It mimics human decision-making, which makes it easy to understand.
:suspect: Project Members: Anushka Bansal (500067844, R164218014), Shreya Sharma (500068573, R164218070), Silvi (500069092, R164218072), Ishika Agrawal (500071154, R164218097)
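The decision-tree approach described above can be sketched with scikit-learn's `DecisionTreeClassifier`; the symptom features and condition labels below are hypothetical placeholders, not the project's actual dataset.

```python
# Minimal sketch of a symptom -> diagnosis decision tree, assuming
# scikit-learn is available; symptoms and conditions are illustrative only.
from sklearn.tree import DecisionTreeClassifier

SYMPTOMS = ["fever", "cough", "headache", "rash"]  # hypothetical feature columns

# Each row is a binary symptom vector; labels are hypothetical conditions.
X = [
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 0, 0, 1],
    [0, 1, 0, 0],
]
y = ["flu", "migraine", "allergy", "cold"]

clf = DecisionTreeClassifier(criterion="entropy", random_state=0)
clf.fit(X, y)

# Internal nodes split on symptom features; each leaf is an outcome.
patient = [[1, 1, 0, 0]]  # fever + cough
print(clf.predict(patient)[0])  # -> "flu"
```

In a chatbot loop, each internal-node split maps naturally onto one yes/no question asked of the patient, which is why the description above calls the approach pragmatic for serial questioning.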
Elysian01 / Sign Language Translator
Sign Language Translator enables hearing-impaired users to communicate efficiently in sign language, and the application translates it into text/speech. The user trains the model by recording their own sign language gestures. Internally it uses MobileNet and a KNN classifier to classify the gestures.
MohamedBayomey / Sign Language Detection
This project develops a Sign Language Detection system to bridge the communication gap between hearing individuals and the deaf or hard-of-hearing community. Using machine learning and computer vision, the system detects and classifies American Sign Language (ASL) signs in real time, promoting accessibility.
aryclenio / LIBRAS Image Classifier
This project demonstrates the use of neural networks and computer vision to create a classifier that interprets Brazilian Sign Language (LIBRAS).
pfoy / ASL Recognition With Deep Learning
Build a convolutional neural network to classify images of letters from American Sign Language.
AkramOM606 / American Sign Language Detection
A real-time American Sign Language (ASL) detection system using computer vision and deep learning. This project combines OpenCV, MediaPipe, and TensorFlow to detect and classify ASL hand signs from camera input. The system recognizes a wide range of ASL characters and can be used to facilitate communication for sign language users.
CodingSamrat / Sign Language Recognition
A machine learning model that classifies the various hand gestures used for fingerspelling in sign language.
evernext10 / Hand Gesture Recognition Machine Learning
An automatic method for recognizing hand gestures to categorize vowels and numbers in Colombian Sign Language, using Neural Networks (perceptrons), Support Vector Machines, and K-Nearest Neighbors as classifiers.
Yashk1434 / Hand Sign Detector
A real-time hand sign language detection system built using OpenCV, Flask, and deep learning. The system captures a live webcam feed, detects hand gestures using the cvzone HandTrackingModule, and classifies the detected signs using a custom-trained Keras model.
Goutam1511 / Sign Language Recognition Using Scikit Learn And CNN
The project aims to build a machine learning model that classifies the various hand gestures used for fingerspelling in sign language. In this user-independent model, classification algorithms are trained and tested on a set of image data. Various machine learning algorithms are applied to the datasets, including a Convolutional Neural Network (CNN).
webdevpathiraja / Hand Gesture Sign Detection Project
This project uses OpenCV and MediaPipe to detect and classify hand gestures in real time. The system utilizes computer vision and machine-learning-based hand tracking to analyze hand landmarks and classify gestures accurately. Ideal for applications in human-computer interaction, gesture-based control, and sign language interpretation.
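Several entries above follow the same landmark-based pipeline: extract hand landmarks, then classify the pose. As a rough illustration of the classification stage only, here is a minimal sketch that assumes the 21 hand landmarks have already been extracted upstream (e.g. by MediaPipe Hands) and matches them against stored templates; the template gestures and names are hypothetical, not taken from any of these projects.

```python
# Sketch of classifying a hand pose from 21 (x, y) landmarks, assuming the
# landmarks were already produced upstream (e.g. by MediaPipe Hands).
# Templates and gesture names are illustrative only.
import numpy as np

def normalize(landmarks):
    """Make the pose translation- and scale-invariant:
    translate so the wrist (landmark 0) is the origin, scale to unit size."""
    pts = np.asarray(landmarks, dtype=float)
    pts = pts - pts[0]                 # wrist-relative coordinates
    scale = np.abs(pts).max()
    return pts / scale if scale > 0 else pts

def classify(landmarks, templates):
    """Return the name of the template whose normalized landmarks are closest."""
    query = normalize(landmarks)
    best, best_dist = None, float("inf")
    for name, tmpl in templates.items():
        dist = np.linalg.norm(query - normalize(tmpl))
        if dist < best_dist:
            best, best_dist = name, dist
    return best

# Two toy 21-point templates (hypothetical "open_palm" vs "fist" shapes).
open_palm = [(0, 0)] + [(i % 5 + 1, -(i // 5 + 1) * 2) for i in range(20)]
fist      = [(0, 0)] + [(i % 5 + 1, -(i // 5 + 1)) for i in range(20)]
templates = {"open_palm": open_palm, "fist": fist}

print(classify(open_palm, templates))  # matches its own template
```

Real systems typically replace the nearest-template step with a trained classifier (KNN, SVM, or a small neural network, as the entries above describe), but the normalization step is common to most of them.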
benlalaraid / Arabic Sign Language Image Classification With CNN
This project aims to build a robust image classification model to recognize and classify Arabic sign language gestures. Using deep learning techniques and convolutional neural networks (CNNs), the goal is to facilitate communication for the deaf and hard-of-hearing communities by translating sign language gestures into written Arabic text.
KhaledAshrafH / Sign Language Digit Recognizer
This project classifies sign language digits from 0 to 9 using different neural network architectures (FFNN, LSTM, CNN) and an SVM. The goal is a sign detector that can recognize and translate sign language gestures to text.
MonzerDev / Real Time Sign Language Recognition
A real-time sign language recognition system utilizing MediaPipe and CNNs to classify static gestures (A-Z, 1-9). Includes custom datasets, preprocessing, training, and evaluation scripts, designed to enhance accessibility for the deaf and hard-of-hearing communities.
olpotkin / DNN Gesture Classifier
Deep Neural Network gesture classifier for Russian Sign Language.
88448844 / Sign Language Detector
A Python-based sign language detector that uses computer vision and machine learning to recognize and classify sign language gestures in real time.
SomyanshAvasthi / Sign Language Detection Using MediaPipe
This repository implements a sign language detection system using MediaPipe, built entirely from scratch with manually collected and annotated data. Leveraging MediaPipe's hand landmark detection, the system processes video frames to classify and translate sign language gestures in real time.
RiaanSadiq / Sign Language Detection
Sign language detection using CNN and Flask. This project uses Convolutional Neural Networks (CNNs) to detect and classify sign language gestures and includes a Flask web application for real-time detection. Simply use your webcam to translate gestures into text.