6 skills found
saoudrizwan / Piano
Easily play combinations of sound effects and Taptic Engine vibrations on iOS.
ArunMichaelDsouza / Joypad.js
JavaScript library that lets you connect and use various gaming controllers in browsers that support the Gamepad API. Less than 5KB in size, with zero dependencies and support for button-press events, axis-movement events, and the vibration play effect.
DanielStormApps / Vibrator
Play system vibrations and Apple Haptic Audio Pattern (AHAP) files in your iOS applications.
rsar97 / Adaptive Filtering Technique For Error Detection And Correction Of Precision Welding Robot
Adaptive robotics and automation play a major role in automobile manufacturing, space research, logistics, agriculture, and many other fields. One such robot is the welding robot, programmed to weld products in the automotive industry. These robots are highly accurate, but vibrations in the motors or other external factors can cause the robot to deviate from its specified position, leading to defective welds. The robot arm is therefore subjected to object tracking: its position is tracked by mounting an accelerometer on the arm, and the position deviation is corrected using a Kalman filtering technique. Kalman filters are used in robotic motion planning, control, and trajectory optimization; a common application is state prediction and estimation, and object tracking. This paper applies the Kalman filtering technique to a three-axis accelerometer mounted on the robotic arm of a welding robot. The voltage values of the accelerometer sensor are taken for state prediction, and through recursive iterations the values are optimized so that the error becomes minimal when the robot has deviated from its desired position.
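The recursive prediction-and-update loop this abstract describes can be sketched as a minimal one-dimensional Kalman filter over a single accelerometer voltage axis. This is an illustrative sketch only; the function name, the noise variances q and r, and the sample readings are assumptions, not values taken from the paper.

```python
def kalman_1d(measurements, q=1e-4, r=0.01, x0=0.0, p0=1.0):
    """Recursively estimate a slowly varying signal (e.g. one
    accelerometer voltage axis) from noisy measurements.
    q: process noise variance, r: measurement noise variance."""
    x, p = x0, p0              # state estimate and its variance
    estimates = []
    for z in measurements:
        # Predict: constant-position model, variance grows by q
        p = p + q
        # Update: blend prediction with the new measurement z
        k = p / (p + r)        # Kalman gain
        x = x + k * (z - x)
        p = (1 - k) * p
        estimates.append(x)
    return estimates

# Hypothetical noisy readings around a true voltage of 1.65 V
readings = [1.7, 1.6, 1.68, 1.62, 1.66, 1.64, 1.67, 1.63]
smoothed = kalman_1d(readings, x0=readings[0])
```

In practice q and r would be tuned to the motor-vibration and sensor-noise characteristics; a larger r makes the filter trust its prediction more and the raw voltage less.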
azb / UnityVRTutorialMaker
A VR tutorial creator tool for Unity that lets you easily create tutorials for how to play your game and helps users figure out the different controller elements and how to use them. It handles things like highlighting the right button or joystick on each platform and playing vibration feedback, includes prefabs for showing tutorial tips that point to those buttons, etc., and works automatically on all platforms, including Vive, Rift, Index, Quest, and Go.
sgandhi04 / K9
Parks, downtowns, malls, and stores are places we visit frequently in our day-to-day lives. These public venues serve a multitude of purposes such as socializing, dining, shopping, and playing, so it is important that they are easily accessible to EVERYONE. Of the roughly 7.5 billion people alive today, around 285 million live with visual impairment, making it crucial that we make public venues accessible to this portion of the world's population. This project focuses on creating a navigational aid that leverages computer vision, artificial intelligence, robotics, and a variety of sensors to make an ideal assistive technology that the visually impaired can use in public environments. The K9 includes many features, listed below:
- A robotic guide vehicle that helps the visually impaired and elderly navigate public indoor/outdoor surroundings
- Easy control of the guide vehicle's speed
- Movement along a predetermined path
- Independent movement by detecting and avoiding obstacles
- Identification of obstacles and various objects
- Sound feedback
I decided to create the device on the Arduino platform, using a cheap computer vision camera and vibration motors for obstacle detection. After some research, I discovered the low-cost CMUcam5 Pixy computer vision camera, which is capable of recording signatures of objects; the device was able to detect pre-programmed obstacles by their hue. Leveraging the Pixy cam, an ultrasonic sensor, and a line follower, I created a device that can navigate a user around a store. This product can follow a predetermined path, avoid obstacles and return to the path, and beep when it finds a specific object. Not only does this product navigate the user around a public environment, it also identifies specific objects (e.g. a tomato).
In order to give the user more control over the robot, I built a hand dynamometer that allows the robot to change its speed based on the strength of one's grip. To test my product, I replicated an indoor public environment using toy food, wood for aisles, and electrical tape for the predefined path. I tested the product three times for each object-detection case (object on the left, on the right, and on both sides) for a total of 9 trials. K-9 was 80% successful for objects on the left, 84.6% on the right, and 79% for left and right combined. From my data, as well as qualitative observations, I can conclude that this product has the potential to help guide visually impaired individuals in public surroundings. Although it meets all of my criteria with 82% accuracy, it will need to approach 100% accuracy to reach the mainstream. In the future, I hope to evaluate other computer vision cameras, such as Google AIY, to further aid object identification.
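The dynamometer-based speed control and ultrasonic obstacle handling described above could be combined roughly as follows. This is a hypothetical sketch: the thresholds, function name, and the grip-strength scale are assumptions for illustration, not the project's actual logic.

```python
def guide_speed(distance_cm, grip_strength, max_speed=100):
    """Decide the guide vehicle's motor speed (0-100) from an
    ultrasonic distance reading (cm) and a hand-dynamometer grip
    value in [0.0, 1.0]. Thresholds are illustrative assumptions."""
    STOP_CM = 20   # obstacle too close: halt so avoidance can take over
    SLOW_CM = 60   # obstacle nearby: creep forward cautiously
    if distance_cm < STOP_CM:
        return 0
    # Grip strength scales speed; clamp to the valid range first
    speed = max_speed * max(0.0, min(grip_strength, 1.0))
    if distance_cm < SLOW_CM:
        speed *= 0.5               # halve speed near obstacles
    return round(speed)

# Firm grip with a clear path runs at full speed; an obstacle
# ahead slows and then stops the vehicle
print(guide_speed(150, 1.0))  # 100
print(guide_speed(40, 1.0))   # 50
print(guide_speed(10, 1.0))   # 0
```

On an actual Arduino build, the distance would come from the ultrasonic sensor's echo timing and the grip value from an analog read of the dynamometer, with this decision applied on every control loop tick.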