Augmented Reality

Augmented Reality (AR) turns the surrounding environment into a digital platform by placing virtual objects in the real world in real time. AR has a wide variety of uses, including gaming, digital retail, navigation, and design.

Our research group uses AR to estimate the “safety aura” around a robot, which can make human–robot interaction safer. The application developed in the ARMS laboratory generates an AR environment that overlays virtual safety information onto the user's view during collaborative work with Variable Stiffness Actuated (VSA) robots. Virtual objects representing safe and danger zones are generated based on features identified in the real-world environment of the robot's workspace.
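As a rough illustration of the zone idea only (a minimal sketch, not the ARMS application's actual pipeline), the Python snippet below labels a tracked point in the robot's workspace as lying in a danger, warning, or safe zone by its distance from the robot base; the zone radii and the tracked point are assumptions made for this example.

    # Minimal sketch (assumed parameters, not the ARMS lab's implementation):
    # label a 3D point in the robot workspace by radial distance from the robot base.
    import numpy as np

    DANGER_RADIUS = 0.5   # m, assumed inner radius of the "safety aura"
    WARNING_RADIUS = 1.0  # m, assumed outer radius

    def classify_zone(point_xyz, robot_base_xyz):
        """Return the safety-zone label for a 3D point in the robot workspace."""
        distance = np.linalg.norm(np.asarray(point_xyz) - np.asarray(robot_base_xyz))
        if distance < DANGER_RADIUS:
            return "danger"
        if distance < WARNING_RADIUS:
            return "warning"
        return "safe"

    # Example: a user's tracked hand position, with the robot base at the origin.
    print(classify_zone([0.3, 0.2, 0.1], [0.0, 0.0, 0.0]))  # -> "danger"

In the AR view, such a label would determine which virtual zone object is rendered around the robot; the actual system relies on richer workspace features than a single distance threshold.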

All Publications

Deep Learning Based Object Recognition Using Physically-Realistic Synthetic Depth Scenes.

Baimukashev, D.; Zhilisbayev, A.; Kuzdeuov, A.; Oleinikov, A.; Fadeyev, D.; Makhataeva, Z.; and Varol, H. A.

Machine Learning and Knowledge Extraction, 1(3): 883–903. 2019.

Human grasping database for activities of daily living with depth, color and kinematic data streams.

Saudabayev, A.; Rysbek, Z.; Khassenova, R.; and Varol, H. A.

Scientific Data, 5: 180101, May 2018.

Vital sign monitoring utilizing Eulerian video magnification and thermography.

Aubakir, B.; Nurimbetov, B.; Tursynbek, I.; and Varol, H. A.

2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), pages 3527–3530, Aug 2016.

ChibiFace: A sensor-rich Android tablet-based interface for industrial robotics.

Nurimbetov, B.; Saudabayev, A.; Temiraliuly, D.; Sakryukin, A.; Serekov, A.; and Varol, H. A.

2015 IEEE/SICE International Symposium on System Integration (SII), pages 587–592, Dec 2015.