Detailed Project Descriptions

SMUPal Chatbot.

SMUPal is an AI chatbot tailored to help students and the wider community at Saint Mary's University (SMU). Its primary goal is to provide accessible, clear, and empathetic assistance on various university-related topics, including campus resources, academic support, student life, and more.

SMUPal is trained on a small set of data scraped from the Saint Mary's University website and SMUSA (Saint Mary's University Students' Association). This allows the chatbot to provide answers that are both detailed and relevant. The chatbot has been customized to recognize keywords, offer encouragement, and guide users to helpful resources, making it a valuable tool for both new and returning students. Additionally, SMUPal is designed to foster a welcoming environment, particularly for students who are new to SMU.
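
As a rough illustration of how such a corpus might be collected (the URLs, page selectors, and output format below are placeholders, not the actual pipeline behind SMUPal), a small scraping script could look like this:

    # Hypothetical sketch: gather page text from a few SMU/SMUSA pages into a
    # small corpus. URLs and tag choices are placeholders, not the real ones.
    import json
    import requests
    from bs4 import BeautifulSoup

    PAGES = [
        "https://www.smu.ca/",   # placeholder URL
        "https://smusa.ca/",     # placeholder URL
    ]

    corpus = []
    for url in PAGES:
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        # Keep headings and paragraphs as plain-text snippets.
        for tag in soup.find_all(["h1", "h2", "h3", "p"]):
            text = tag.get_text(strip=True)
            if text:
                corpus.append({"source": url, "text": text})

    with open("smupal_corpus.json", "w", encoding="utf-8") as f:
        json.dump(corpus, f, ensure_ascii=False, indent=2)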

For proof of concept, SMUPal has been deployed on a static website generated using Jekyll. The chatbot's interface and full functionality are temporarily available at this link until it is permanently hosted on a dedicated server.

Features of SMUPal

Student-Oriented Design: Tailored to meet student needs, offering empathetic support for academic and emotional challenges.

Personalized Training for SMU: Customized with Saint Mary's University-specific information for relevant guidance.

Agent/University Connection: Can connect users to human agents or university services when needed.

Multilingual Support: Supports multiple languages (e.g., English, Chinese, Japanese) for inclusive access.

Watsonx-Powered Intelligence: Built on IBM Watsonx Orchestrate for accurate, responsive AI assistance.

User-Friendly Interface: Intuitive design with buttons and text fields for easy navigation.

Emotional Support Options: Provides quick-access options for common student concerns like exam stress and feeling overwhelmed.

And more.

SMUPal is not yet trained on all of SMU's data, so its knowledge is currently limited. Below is the set of data it has been trained on for demo purposes.

  1. SMUSA resources information.

  2. NEW To SMU program info.

  3. On-campus jobs info.

  4. Dining options on-campus.

  5. Residence resources.

  6. Comprehensive information on first-year CS courses at Saint Mary's University.

  7. Some additional general question-answering capabilities.

More Visuals and Information.
SMUPal Interface:

The chatbot’s design includes a clean, interactive interface with buttons for commonly asked questions or issues, as well as an open text field for typing custom inquiries, making SMUPal intuitive and easy to use for all students.

Guide On How To Use:

There is a step-by-step guide webpage to help students navigate and make the most of the SMUPal chatbot. This page also includes important information about any planned maintenance, so students will be aware of any temporary changes in SMUPal's responses or behavior.

Multilingual:

SMUPal supports multiple languages, including English, Chinese, Japanese, and more. This feature makes it accessible to a diverse student body by breaking down language barriers and fostering inclusion.

Ability To Connect To University Staff:

SMUPal can connect users to a human agent or to university support services if their queries require personalized human assistance. This feature ensures that students can escalate issues beyond the chatbot when needed.

Special thanks to

Yilin Huang & Muhammad Shaheer

who took part with me in building and developing this chatbot, which won a second-place finish in IBM's WatsonX Education Challenge.

Pilot launch stats:

SMUPal's stats and student requests a week after its test launch.

Urban Pursuit Game.

Urban Pursuit is a high-action 3D adventure game in which players assume the role of a determined ex-bounty hunter chasing a criminal mastermind through a metropolitan landscape, featuring intense rooftop parkour chases. Players race against the clock in an attempt to catch the villain, and a special shooting scene at the end could let players kill the villain.

The game unfolds in a sprawling metropolis filled with towering skyscrapers, bustling streets, and hidden alleyways, providing players with a dynamic and visually stunning environment to explore.

Gameplay

Intense rooftop parkour: Navigate through intricate obstacles, leaping between buildings, swinging from poles, and evading hazards.

Dynamic combat: Engage in shooting combat with the enemy, utilizing environmental elements to gain an edge.

General Features

• Seamless transitions between rooftop parkour chase sequences.

• Engaging shooting mechanics.

• Cinematic storytelling with compelling characters and emotional depth.

• Vibrant and dynamic city environment.

• Varied gameplay mechanics offering a diverse range of challenges and experiences, and more.

Press Download to get draft 1.0 of the game's design document, along with any references.

Project Phases
Game Idea & Design:

The game was developed to bring an immersive gaming experience that pushes boundaries and captivates players. It is built on the Unity Engine with C# and is primarily aimed at Windows PC gamers. The game idea and design were handcrafted through storyboarding, logic diagrams, and similar techniques.

Prototyping:

Demo scenes were created initially to prototype the idea. Several ready-made assets were downloaded from the Unity Asset Store and modified to align with what we wanted.

Development:

The development stage began with creating the prefabs needed for the city structure and the scripts that run the game. Some were ready-made prefabs from the Asset Store, used to ease the process. Development took about 2-3 weeks.

Playtesting:

The game was playtested many times to assess the difficulty levels and fix any hidden bugs, and changes were made along the way. The game was also peer-reviewed by friends for feedback.

Final Updates, Documentation & Build:

Final updates and cleanup were done at this stage. The game was then built.

Special thanks to

Muttyeb Tahir & Bhanu Parkash

who took part with me in the development and build of this game from start to finish within such a short timeframe!

Soccer Playing Robot.

This remotely operated vehicle was built using simplified C++, enabling it to navigate and complete tasks such as overcoming obstacles. The project began with the design phase, outlining the desired final appearance and functionality of the robot.

During the assembly process, most of the robot's parts were 3D printed, and certain components were soldered together to facilitate easy connections between different elements. The robot car is powered by two 3.7-volt batteries arranged in series, providing the necessary energy for its operations.

An integral feature of the robot is the in-built ESP32 chip, which enables both WiFi and Bluetooth integration for convenient control. This allows for seamless communication with the Dabble mobile app, serving as the interface for real-time control. The app ensures a user-friendly experience by interfacing directly with the ESP32 chip onboard, allowing for efficient and responsive control of the remotely operated vehicle.

Project Phases
Assembly Phase:

Assembled all project parts as planned in the design process. Attached different parts together using screws and other tools.

Power Setup:

Implemented the power system by connecting two 3.7-volt batteries in series to supply the necessary energy for the robot car. Wiring the cells in series raises the overall nominal voltage (2 × 3.7 V = 7.4 V), which is crucial for powering the motors and other components that require a higher voltage input. Proper insulation and protection mechanisms were also put in place to safeguard both the batteries and the surrounding components from any potential issues related to the power system.

Software Development:
  • Develop the control software using simplified C++.

  • Integrate the Dabble mobile app as the user interface for real-time control.

  • Ensure compatibility and effective communication between the software and the in-built ESP32 chip.

Testing and Debugging:
  • Conduct thorough testing of the robot's functionality, addressing any issues in the hardware or software.

  • Debug and optimize the code for efficient performance.

Demonstration and Presentation:
  • Prepare a demonstration to showcase the capabilities of the remotely operated vehicle.

Deployment:
  • Prepared a demonstration to showcase the capabilities of the remotely operated vehicle.


Project supervisor: Mohamed Issa
Email: info@eurekatec.ca

TrussVision AI.

TrussVision AI is designed to streamline the quality control process for wood truss manufacturing, making it more efficient and reliable. We integrated high-definition cameras, laser projections, and a Deep Learning model into a system to ensure every truss meets precise standards before it leaves the production line.

HD Cameras for Detailed Inspection

We have set up four high-definition cameras around the truss to capture every angle. These cameras work together to provide a comprehensive view of the truss’s alignment, structure, and connections:

  • Top Camera: Positioned above the truss, it captures the top view, inspecting the overall alignment and structure.

  • Bottom Camera: It is mounted beneath the truss and inspects the underside for hidden defects or issues with connection points.

  • Side Cameras: Two cameras are placed on opposite sides to analyze joint alignments and ensure there are no defects.

Laser Projection for Precise Alignment

To further enhance accuracy, we’ve integrated laser projection into the system. Using the Virtek TrussLine Laser Projection system, we project precise truss designs over each unit to verify its alignment and ensure that all components are placed correctly. The laser system can also be adjusted dynamically when design changes are made.

AI Model for Defect Detection

The AI model plays a crucial role in identifying defects in the trusses. Trained on SDNET2018, a dataset of over 56,000 images, the model can detect cracks as small as 0.06 mm and as large as 25 mm, reaching an accuracy rate of 89% after 15 epochs. The model analyzes the images captured by the cameras and classifies each truss as defective or non-defective based on the presence of faults such as missing plates, misalignments, or gaps in joints. If any defects are found, it alerts the production team and a manual inspection is carried out.
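
As a rough sketch of how the per-image classification and alert step might work (the fine-tuned checkpoint name, camera frame path, and label ordering below are illustrative assumptions, not the production code):

    # Sketch: classify one camera frame as defective/non-defective and flag it.
    # Assumes a fine-tuned EfficientNet_B0 checkpoint (see the training sketch
    # below) and the label order {0: non-defective, 1: defective}; both are
    # assumptions for illustration.
    import torch
    from PIL import Image
    from torchvision import transforms
    from torchvision.models import efficientnet_b0

    preprocess = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    model = efficientnet_b0()
    model.classifier[1] = torch.nn.Linear(model.classifier[1].in_features, 2)
    model.load_state_dict(torch.load("trussvision_effnet_b0.pt", map_location="cpu"))
    model.eval()

    def is_defective(image_path: str) -> bool:
        img = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
        with torch.no_grad():
            return model(img).argmax(dim=1).item() == 1

    if is_defective("top_camera_frame.jpg"):  # placeholder camera frame
        print("Defect detected: alert production team for manual inspection.")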

We used a transfer learning approach with EfficientNet_B0, freezing the base layers and training only the classifier head.
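
In PyTorch terms, that transfer-learning setup might look roughly like the sketch below (the dataset folder layout, batch size, and learning rate are illustrative assumptions):

    # Sketch: load a pretrained EfficientNet_B0, freeze its feature extractor,
    # and train a new 2-class head (defective vs. non-defective).
    # "data/train" with one subfolder per class is an assumed layout.
    import torch
    from torch import nn
    from torch.utils.data import DataLoader
    from torchvision import datasets
    from torchvision.models import efficientnet_b0, EfficientNet_B0_Weights

    weights = EfficientNet_B0_Weights.DEFAULT
    model = efficientnet_b0(weights=weights)

    # Freeze the pretrained base layers.
    for param in model.features.parameters():
        param.requires_grad = False

    # Replace the classifier head with a fresh 2-class layer.
    model.classifier[1] = nn.Linear(model.classifier[1].in_features, 2)

    train_data = datasets.ImageFolder("data/train", transform=weights.transforms())
    loader = DataLoader(train_data, batch_size=32, shuffle=True)

    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.classifier.parameters(), lr=1e-3)

    model.train()
    for epoch in range(15):  # 15 epochs, matching the figure quoted above
        for images, labels in loader:
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()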

At the time of writing, the model has been trained using the SDNET2018 dataset, which provides data for structural defect detection. Unfortunately, due to the lack of publicly available truss-specific datasets, this was chosen as a suitable alternative for the project's development.

Efficient and Reliable Inspection Process

The entire system works together to ensure thorough, reliable inspection in a fraction of the time it would take with manual methods. The high-definition cameras provide detailed images, the laser projection system ensures perfect alignment, and the AI model quickly identifies any issues, making the entire process automated and efficient. This solution helps reduce rework, ensures quality control across the production line, and ultimately leads to a more reliable truss product.

Our Experience with Payzant

This project was developed as part of the Experience Ventures program, with a placement at Payzant Company. We worked closely with their team and were able to apply our theoretical knowledge to a real-world challenge.

Special thanks to

Andrew, Emmanuel, and Pesanth

who took part with me in the build and development of this TrussVision AI system.

Examples of Defective Trusses

Descriptions of the images

Image 1 – Example of a gap that is potentially too large.

Image 2 – Example of a plate shifted too far to one side.

Image 3 – Example of truss joints with the laser projection, before the connector plate is applied.

Image 4 – Example of trusses with plates, before the first roller.

Image 5 – Example of plates before the roller, viewed from the side.

Image 6 – Example of plates before the roller.

Image 7 – Completed quality trusses.

Video – Example of how the trusses roll out post-production.

Smart Mirror With Banana Pi

To be updated soon.