Lumino

Host department: Department of Electrical and Electronic Engineering

Project supervisor: Dr. W.L. Tam (Department of Electrical and Electronic Engineering)

Project background

Lumino is a cost-efficient, one-stop solution for barrier-free movement for the visually impaired.

Guide dogs and white canes are currently the most common navigational aids for the visually impaired. Common, but not effective: beyond the severe shortage of guide dogs in Hong Kong (only 50), prior training is time-consuming, and social stigma remains a prevalent concern. The white cane is instrumental in detecting immediate hazards, but its radius of safety is limited to its length, and the true nature of the detected obstructions remains unknown to the user, resulting in constant uncertainty and unresolved safety risks.

We live in an urban setting overwhelmingly designed for sighted people, making getting around one of the most pressing concerns for the visually impaired. We believe that visual impairment should not restrict one's autonomy to navigate one's environment hindrance-free. That is why we came up with Lumino: a guide dog, a white cane and much more in a single package. With smart object identification, hazard detection and real-time GPS navigation backed by a novel machine learning algorithm, coupled with intuitive audio and haptic feedback and complemented by compact hardware, Lumino empowers the visually impaired, one confident stride at a time.

Project leader: Wong Kwong Yat Felix, BEng (CE)

Team member(s): Wong Kwong Yat Felix, BEng (CE), Wong Chi Ping Desmond, BEng (CE), Yu Shing Chit Alvin, BA&LLB

Project details and process

Innovation:
For the convenience of our users, we strongly believe in precognition: actively providing suggestions or completing tasks for users based on their app usage habits, instead of passively responding to their input. We have adopted a hybrid model: data unique to each user, such as their daily routines and idiosyncrasies, is stored locally on their device, while data beneficial to all users, such as data on typical structures and objects (walls, cars, keys), is stored in the cloud. Our self-developed algorithm processes the user's surrounding environment as captured by the live feed of their smartphone camera, cross-references objects of interest against similar data stored in local and cloud storage, and produces suggestions for the user's reference or completes tasks for them in the background. Visually impaired people no longer have to scour their house for misplaced keys, nor worry about impending hazards just out of reach of their white canes: Lumino is able to anticipate their needs and provide context-contingent assistance, backed by an extensive local and cloud database.
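
As an illustration of this hybrid lookup, the sketch below combines a shared object catalogue with per-user habit weights to rank detected objects. It is a minimal Python sketch with hypothetical in-memory stand-ins for the cloud catalogue and the local habit store; none of the names are taken from the actual Lumino codebase.

    # Minimal sketch of the hybrid local/cloud lookup (hypothetical data and names).

    # Stand-in for the shared cloud catalogue of typical structures and objects.
    CLOUD_CATALOGUE = {
        "wall": {"category": "structure", "base_priority": 0.2},
        "car":  {"category": "vehicle",   "base_priority": 0.5},
        "keys": {"category": "personal",  "base_priority": 0.4},
    }

    # Stand-in for the on-device store of this user's routines and idiosyncrasies.
    LOCAL_HABITS = {
        "keys": 0.9,  # the user frequently searches for their keys
        "car":  0.1,
    }

    def rank_detections(detected_labels):
        """Cross-reference detections against cloud and local data and rank them."""
        ranked = []
        for label in detected_labels:
            info = CLOUD_CATALOGUE.get(label)
            if info is None:
                continue  # unknown object: nothing useful to suggest
            habit_boost = LOCAL_HABITS.get(label, 0.0)
            score = info["base_priority"] + habit_boost
            ranked.append((score, label, info["category"]))
        return [(label, category) for _, label, category in sorted(ranked, reverse=True)]

    print(rank_detections(["wall", "keys", "car"]))
    # -> [('keys', 'personal'), ('car', 'vehicle'), ('wall', 'structure')]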

Functionality:
Lumino’s software has three main functions: object identification, hazard detection and navigation. All three functions deliver feedback to the user via a virtual assistant, in the same vein as well-established assistants such as Siri and Google Assistant.

As mentioned above, the user’s surrounding environment is captured by their smartphone camera. The user can tap different parts of the screen to learn about the relevant objects present in that region of the view (for example, tapping on the left half of the screen reveals objects of interest to the left of the user). Our algorithm searches local and cloud storage, taking into account the user’s daily habits, and reports the object most relevant to the user’s needs. By logging user behaviour, the software can also provide suggestions: a path to the dairy section will be suggested to a user who buys a carton of milk from the supermarket every Tuesday.
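
A minimal sketch of how a tap position could be mapped to a screen region and used to filter detections is shown below. The data structures are hypothetical illustrations, not the app's actual model.

    # Sketch: map a tap to a screen half and report detections on that side.
    def screen_region(tap_x, screen_width):
        """Return 'left' or 'right' depending on where the user tapped."""
        return "left" if tap_x < screen_width / 2 else "right"

    def objects_in_region(detections, region, screen_width):
        """Keep detections whose bounding-box centre falls in the tapped region."""
        selected = []
        for label, (x_min, _, x_max, _) in detections:
            centre_x = (x_min + x_max) / 2
            if screen_region(centre_x, screen_width) == region:
                selected.append(label)
        return selected

    # Example: detections given as (label, (x_min, y_min, x_max, y_max)) in pixels.
    detections = [("door", (50, 200, 250, 800)), ("bench", (700, 400, 1000, 700))]
    region = screen_region(tap_x=120, screen_width=1080)
    print(region, objects_in_region(detections, region, screen_width=1080))
    # -> left ['door']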

Object identification also extends to user protection by recognizing common obstructions such as walls, potholes and steps. Additionally, the user’s movements, such as their pace, are logged to accurately gauge the distance of any potential hazards from the user and issue timely warnings. This function is further supported by compact hardware: a forearm-length device with the potential to replace white canes. An in-built three-axis motor generates angular momentum that acts as pseudo-propulsion, mimicking a pulling sensation that steers the user clear of any potential dangers in their path.
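
The sketch below illustrates one way the logged pace could be combined with an estimated hazard distance to decide when a warning is due. The threshold and function names are assumptions for illustration, not the production logic.

    # Sketch: issue a warning when the estimated time to reach a hazard is short.
    WARNING_SECONDS = 3.0  # assumed threshold: warn roughly 3 s before reaching the hazard

    def time_to_hazard(distance_m, pace_m_per_s):
        """Seconds until the user reaches the hazard at their current pace."""
        if pace_m_per_s <= 0:
            return float("inf")  # user is stationary: no imminent collision
        return distance_m / pace_m_per_s

    def hazard_warning(label, distance_m, pace_m_per_s):
        """Return a warning message if the hazard is close enough, else None."""
        eta = time_to_hazard(distance_m, pace_m_per_s)
        if eta <= WARNING_SECONDS:
            return f"Warning: {label} about {distance_m:.1f} metres ahead."
        return None

    print(hazard_warning("pothole", distance_m=3.5, pace_m_per_s=1.4))
    # -> Warning: pothole about 3.5 metres ahead.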

Lumino also integrates seamlessly with contemporary web mapping services such as Google Maps and Apple Maps to provide its users with navigational assistance. An overlay is added on top of these mapping services to insert waypoints at the turns and crossings of the suggested route, and the virtual assistant notifies the user at such junctions. Smartphone haptic feedback also keeps the user on the route by vibrating until the user faces the correct direction.
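
For the heading-correction behaviour, a minimal sketch of the underlying geometry is shown below: the bearing to the next waypoint is computed from latitude/longitude and compared with the device compass heading, and the phone keeps vibrating until the two agree within a tolerance. The tolerance and coordinates are assumptions, and the mapping-service integration and actual vibration call are not shown.

    import math

    ALIGN_TOLERANCE_DEG = 15.0  # assumed tolerance before vibration stops

    def bearing_deg(lat1, lon1, lat2, lon2):
        """Initial great-circle bearing from point 1 to point 2, in degrees."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dlon = math.radians(lon2 - lon1)
        y = math.sin(dlon) * math.cos(phi2)
        x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
        return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

    def should_vibrate(device_heading_deg, lat, lon, waypoint_lat, waypoint_lon):
        """Vibrate while the user is not yet facing the next waypoint."""
        target = bearing_deg(lat, lon, waypoint_lat, waypoint_lon)
        error = abs((device_heading_deg - target + 180.0) % 360.0 - 180.0)
        return error > ALIGN_TOLERANCE_DEG

    # Example: user facing due north, next waypoint roughly to the east.
    print(should_vibrate(0.0, 22.2830, 114.1371, 22.2830, 114.1400))
    # -> True (keep vibrating until the user turns towards about 90 degrees)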

Project result

Collegiate Computing Contest: Mobile Application Innovation Contest – Second Prize
Youth Innovation Award – Best Creative Idea Finalist

Student learning and achievements

By participating in this competition/exhibition, the students in the group gained the following knowledge and experience:

  • Hosting a booth at an overseas exhibition
  • Promoting the project to the general public
  • Exchanging thoughts and ideas with experts from different fields
  • Making contact with various potential cooperation partners

Future plan

With all the valuable feedback from field specialists and professionals, we now have a clearer picture of the future roadmap for project Lumino.

  1. Future enhancement of the context-aware algorithm
    Currently, the algorithm is not yet optimized for certain situations, including some common daily occurrences. We expect to refine it into a more solid, robust machine learning algorithm that can actively retrain itself when needed, allowing a more inclusive experience for our users.
  2. Future refinement of the hardware prototype
    Due to hardware limitations, in particular the limited processing power of the Raspberry Pi, we were unable to achieve excellent results with our hardware prototype, which still suffers from high latency and accuracy issues. As for the interface, we have yet to improve the medium through which we communicate with our users. More user testing will be carried out to determine whether a different approach would be a better solution.

Sharing

Wong Kwong Yat Felix

Coming up with this idea, we never thought we would come this far. All we wanted to do initially was to build a prototype that could solve the problem for the visually impaired. To our surprise, solving a seemingly simple problem was spectacularly hard. We achieved results step by step along the journey; though it was long and tedious, it was still worth it. This Innovation Showcase was the stepping stone to something bigger. It brought us to a bigger stage, both to learn and to show the world what we have done. The feedback and discussions with field specialists drew our attention to details that had previously been neglected. With all these new experiences, we are confident that we will be able to improve upon our current work and bring forth a better product for the visually impaired in the future.

Wong Chi Ping Desmond

Being able to participate in this Innovation Showcase and Award is truly a new experience for me, as well as a major milestone for project Lumino. It was my first time attending such a large-scale event as an exhibitor, with the chance to communicate and exchange ideas with different visitors, sharing our passion and vision with them, and receiving much positive feedback after they learned about our project. It is really encouraging, and it motivates us to work harder on completing this project and launching it for those who are looking forward to it. Better still, we received a great deal of feedback from professionals in different fields. With their valuable input, we now have a more concrete idea of how to move forward with the project, and we are sure that we will continue to work on it with our utmost endeavour, turn it into a real product and truly make it a guiding beacon for the visually impaired.

Dr. Vincent W.L. Tam

(Project supervisor, Department of Electrical and Electronic Engineering)

It is very exciting to see how our Engineering students, Desmond and Felix, worked hard to put their dreams into action to help the visually impaired throughout the whole process of design, implementation and testing of the Lumino project. I can still recall the first day when they came to me to discuss this “very initial idea” of developing a mobile app, possibly with some sensory tool(s), for the visually impaired to navigate the city, after taking my course ELEC 3641 – Human Computer Interaction: Design & Programming. Since then, we have held regular meetings to “dream, explore & learn” how to turn it into “reality”. As in many successful Engineering projects, I can clearly see that both Desmond and Felix have learned how to “persistently” FUEL their dream with energy and passion. Sometimes we may “debate” specific designs or system features, but such “debates” always help to improve our overall design after considering all the pros and cons in “practice”. Last December, the Lumino team won the best project award – ELEC3442 Embedded System at the 1st Engineering InnoShow. After that, the team was shortlisted to participate in the Youth Innovation Awards at the Singapore Digital Wonderland Exhibition, and went on to reach the finals. With the great support of the Experiential Learning Fund of the HKUEAA and the Tam Wing Fan Innovation Wing of the Faculty of Engineering, the team had the valuable learning opportunity to discuss and interact with international experts and student teams at the Singapore Digital Wonderland Exhibition, and so to refine the design and prototype implementation of the Lumino project. More importantly, both students learned to collaborate with students from other disciplines, such as business and law, to promote the project to the judging panel during the contest, helping them learn from each other and think in a truly interdisciplinary manner, as needed in many real-world Engineering applications nowadays.

Awards

The best project award - ELEC3442 Embedded System @ The 1st Engineering InnoShow

This project team was selected for the best project award – ELEC3442 Embedded System at the 1st Engineering InnoShow.

Finalist of the Youth Innovation Awards in the Singapore Digital Wonderland Exhibition

Team Lumino reached the finals of the Youth Innovation Awards at the Singapore Digital Wonderland Exhibition, organized by the Infocomm Media Development Authority of the Singapore Government.

It is Singapore’s largest tech carnival, held at the Suntec City Convention Centre, with many cool gadgets and exciting activities lined up for everyone to discover, experience and innovate. Its theme is to explore the latest technology to see what it is like to live in a digital future, and to try interactive experiences that show how technology is positively changing the way we live, work and play.

The full article appears in the news section of the Department of Electrical and Electronic Engineering website.

HKUEAA 44th Annual Dinner project presentation

Over 500 guests enjoyed a wonderful evening at the 44th HKUEAA Annual Dinner, themed “Innovation”, on 11 January 2020. Team Lumino was one of the project groups invited to join the event and present their projects to the guests.

Project approved by the HKSTP Science and Technology Entrepreneur Programme (STEP)

After gaining some invaluable international exposure by showcasing Lumino in the 2019 Singapore Youth Innovation Awards, the team resolved to take advantage of the resources offered to them by the HKU EEE Department and further develop Lumino as their final year project. Incorporating Lumino into their coursework proved to be the right choice, as multiple breakthroughs were made in the short span of the first semester. 

Launching Lumino as a startup was always the team’s end goal, and by January 2020, the team was confident enough in the viability of their prototype to set their sights on the HKSTP STEP. In March 2020, the project was selected to receive funding and guidance on fine-tuning ideas and technical development under this one-year startup support programme.

We would like to extend our sincere gratitude and appreciation for all of the support we received from our donors, which include: