Our cross-platform app unifies the capabilities of InnoColor, running on iOS as well as CUDA-capable Windows and Linux devices. It includes augmented-reality software that can simulate and transform real-time camera input, processes ambient sensor input, and provides a versatile user data-generation interface for future research.
We have also designed a wearable prototype. It supports hands-free interaction, delivers the augmented-reality experience, and integrates the TCS3472 ambient light sensor, which is both low-power and low-cost. Users can also control the device remotely with common game controllers such as the Xbox One controller.
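As a rough illustration of the sensor integration, the sketch below normalizes the TCS3472's raw 16-bit RGBC counts into 8-bit RGB values using the clear channel; the I2C address, bus number, and register constants are taken from the TCS3472 datasheet but the wiring details are assumptions, not the prototype's actual firmware.

```python
TCS3472_ADDR = 0x29    # default I2C address per datasheet (assumed wiring)
CMD_BIT = 0x80         # command bit for register access
REG_ENABLE = 0x00      # ENABLE register: PON | AEN = 0x03 powers the ADC
REG_CDATAL = 0x14      # first of eight data bytes: C, R, G, B (lo/hi pairs)

def raw_to_rgb(r, g, b, c):
    """Normalize raw RGBC counts to 0-255 RGB using the clear channel."""
    if c == 0:
        return (0, 0, 0)
    return tuple(min(255, int(255 * ch / c)) for ch in (r, g, b))

def read_rgb(bus_num=1):
    """Read one color sample over I2C (requires smbus2 and real hardware)."""
    from smbus2 import SMBus  # imported lazily; hardware-only dependency
    with SMBus(bus_num) as bus:
        bus.write_byte_data(TCS3472_ADDR, CMD_BIT | REG_ENABLE, 0x03)
        data = bus.read_i2c_block_data(TCS3472_ADDR, CMD_BIT | REG_CDATAL, 8)
        c, r, g, b = (data[i] | (data[i + 1] << 8) for i in (0, 2, 4, 6))
        return raw_to_rgb(r, g, b, c)
```

The clear-channel normalization is a common first step before any calibration; the prototype could refine it with the sensor's gain and integration-time settings.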
InnoColor uses deep-learning-guided image transformation to modify visual content, such as images, video, and GUI elements, across both digital software and physical interfaces.
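To make the transformation pipeline concrete, here is a minimal sketch of applying a per-pixel color transform to a camera frame. The 3x3 matrix stands in for the learned model and is an illustrative assumption (a widely cited protanopia-simulation matrix), not InnoColor's actual network; the real pipeline would run a trained model in its place.

```python
import numpy as np

# Stand-in for the deep-learning model: a fixed linear color transform.
# This protanopia-simulation matrix is an illustrative assumption only.
TRANSFORM = np.array([
    [0.567, 0.433, 0.0],
    [0.558, 0.442, 0.0],
    [0.0,   0.242, 0.758],
])

def transform_frame(frame: np.ndarray) -> np.ndarray:
    """Apply the color transform to an HxWx3 uint8 frame and re-quantize."""
    out = frame.astype(np.float32) @ TRANSFORM.T
    return np.clip(out, 0, 255).astype(np.uint8)
```

Because the operation is a single matrix multiply per pixel, it runs in real time on camera input; swapping in a neural model would change only the `transform_frame` body, not the surrounding capture loop.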