Innovate@RGS
At Royal Global School, we believe in the power of Experiential Learning to shape well-rounded individuals. By engaging students in hands-on activities, real-world problem-solving, and collaborative projects, we ensure that learning extends beyond textbooks. This approach fosters critical thinking, creativity, and a deeper understanding of concepts as students connect theory with practice. Whether it’s through field trips, lab experiments, or community service, Experiential Learning at RGS empowers students to explore, innovate, and apply their knowledge in meaningful ways. The result is confident, curious learners who are prepared to navigate the complexities of the real world with competence and compassion.
EXPERIENTIAL LEARNING
Local Speech-Based Conversational AI Stack (Whisper + Qwen + TTS)
Tagline: “Talk to your own offline assistant that never sends your voice to the cloud.”
Abstract: A local pipeline where speech is transcribed by Whisper (base model, cached), fed into a small local LLM (Qwen / Gemma), then spoken back using a natural-sounding TTS engine. The design enforces strict preferences: cached models, no re-downloads, GPU if available, and wrapper functions for clean reuse. This becomes the foundation for voice-controlled tools, tutors, and kiosks.
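The three stages and the caching preference can be sketched as a thin wrapper layer. This is an illustrative skeleton only: the model names and loader are stand-ins, and the real project would call Whisper, a local LLM runtime, and a TTS engine where the stubs return fixed strings.

```python
from functools import lru_cache

@lru_cache(maxsize=None)          # load each model once, then reuse the cached copy
def load_model(name: str):
    # Placeholder: a real loader would pull cached Whisper/Qwen/TTS weights
    # from disk and move them to the GPU if one is available.
    return f"<{name} loaded>"

def transcribe(audio: bytes) -> str:
    load_model("whisper-base")
    return "what is experiential learning"      # stub transcription

def respond(prompt: str) -> str:
    load_model("qwen-1.5b")
    return f"Answer to: {prompt}"               # stub LLM reply

def speak(text: str) -> str:
    load_model("tts")
    return f"[spoken] {text}"                   # stub TTS render

def assistant(audio: bytes) -> str:
    """The full speech -> text -> LLM -> speech loop."""
    return speak(respond(transcribe(audio)))
```

The `lru_cache` decorator is what enforces "no re-downloads": each model is loaded at most once per process, no matter how many turns the conversation runs.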
Plant Identifier App – “Pocket Botanist”
Tagline: “Point, click, and your phone whispers the plant’s full biography.”
Abstract: Concept + roadmap for a web/mobile app where users snap a plant photo and get species suggestions, care tips, and ecological notes. Built around image classification models (e.g., CNNs or pre-trained vision models) with a simple Next.js front end. Also planned: local caching and offline-first design for low-connectivity regions. A perfect project blending AI, ecology, and citizen science.
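The "species suggestions" screen boils down to ranking the classifier's raw scores. A minimal sketch, assuming a hypothetical four-species label set and logits that would really come from the CNN:

```python
import math

# Hypothetical label set; a real app would use the vision model's classes.
LABELS = ["neem", "tulsi", "banyan", "hibiscus"]

def softmax(logits):
    m = max(logits)                       # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def top_k_species(logits, k=3):
    """Rank species suggestions by confidence, as the results screen would."""
    probs = softmax(logits)
    ranked = sorted(zip(LABELS, probs), key=lambda p: p[1], reverse=True)
    return ranked[:k]
```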
Electronic Voting Machine (EVM) Project
Tagline: “Democracy in a box: press, vote, see results instantly.”
Abstract: A working IoT EVM prototype that mimics real-world voting: buttons for candidates, a controller unit, and a results mode that counts and displays totals. Built using microcontroller logic (Arduino or similar), with debouncing for safe button presses and a clear UI. It's both a civics demo and a digital-logic lesson, ideal for school exhibitions and for explaining secure counting.
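The controller's core logic — debounced sampling plus vote counting — can be shown as a small state machine. The real prototype runs this loop on the Arduino; Python is used here only to make the logic readable, and the sample threshold is an assumed value.

```python
DEBOUNCE_SAMPLES = 3   # consecutive identical reads needed to accept a press

class EVM:
    def __init__(self, candidates):
        self.votes = {c: 0 for c in candidates}
        self._stable = 0
        self._last = None

    def sample(self, pressed):
        """Feed one raw button read (candidate name, or None for no press)."""
        if pressed == self._last:
            self._stable += 1
        else:
            self._last, self._stable = pressed, 1
        # Count the vote exactly once, the moment the read has been
        # stable long enough — holding the button adds nothing more.
        if pressed is not None and self._stable == DEBOUNCE_SAMPLES:
            self.votes[pressed] += 1

    def results(self):
        return dict(self.votes)
```

Debouncing matters because a single physical press produces a burst of noisy reads; requiring several identical samples turns that burst into exactly one vote.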
City-Scale Tree Planting Optimizer – Root Recharge Radar
Tagline: “Plant one tree in the right spot, get ten times the recharge and shade.”
Idea: Use GIS satellite map data (infiltration, NDVI, and runoff layers) to identify micro-sites (road edges, school campuses, medians) where planting specific tree species would:
a. reduce runoff,
b. increase shade on walking routes,
c. avoid root damage to drains.
The output is a ranked "Tree Wish-List Map" for the municipality and schools to adopt.
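The ranking step can be sketched as a weighted composite over the three criteria above. The weights and the example sites are illustrative assumptions; layer values are presumed already normalised to 0–1 by the GIS preprocessing.

```python
# Positive weights reward runoff reduction and shade; drain conflict penalises.
WEIGHTS = {"runoff_reduction": 0.4, "shade_gain": 0.4, "drain_conflict": -0.2}

def site_score(site):
    return sum(w * site[k] for k, w in WEIGHTS.items())

def wish_list(sites, top_n=2):
    """Rank candidate micro-sites into the 'Tree Wish-List Map'."""
    return sorted(sites, key=site_score, reverse=True)[:top_n]
```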
Physics-Guided Finance Bot – Risk Radar for Retail Traders
Tagline: “Instead of predicting price, predict when not to be stupid.”
Idea: A trading bot that integrates Monte Carlo simulation and retail-trader psychology into a single risk radar:
a. Inputs: volatility, liquidity sweeps, time-of-day patterns.
b. Output: a “Risk Radar” score that flags when not to enter, when to size down, and when to stop trading. The output can visually echo the runoff risk maps: a heatmap of “stop-loss hunt zones” over the trading day.
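One way to turn the inputs into a single score is a weighted blend with action thresholds. The weights and cut-offs below are illustrative assumptions, not calibrated values, and all three signals are presumed normalised to 0–1.

```python
def risk_radar(volatility, sweep_intensity, bad_hour):
    """Blend danger signals into one score, then map it to an action."""
    score = 0.5 * volatility + 0.3 * sweep_intensity + 0.2 * bad_hour
    if score >= 0.7:
        return score, "do not enter"
    if score >= 0.4:
        return score, "size down"
    return score, "normal sizing"
```

Note the deliberate asymmetry with a prediction bot: the output is never "buy here", only "how dangerous is it to act right now".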
Invisibility Cloak – OpenCV Demo & Exhibition Kit
Tagline: “The Harry Potter cloak, remixed with Python and a green cloth.”
Abstract: A rebuilt and modernized version of the classic OpenCV invisibility cloak project. Using color segmentation and background subtraction, it “erases” objects cloaked in a particular color from the live video feed. The rebuild updates dependencies, the virtual environment, and platform specifics, then turns the tech into a layman-friendly exhibition with slides explaining how computer vision works under the hood.
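The whole trick is per-pixel compositing: wherever a pixel matches the cloak colour, show the pre-recorded background instead of the live frame. Sketched here on tiny nested-list "frames" so the idea is visible without OpenCV; in the real project the colour match comes from an HSV range test (cv2.inRange) on NumPy arrays.

```python
CLOAK = "G"   # stand-in for the HSV colour range the real mask matches

def remove_cloak(frame, background):
    """Replace cloak-coloured pixels with the stored background."""
    return [
        [bg if px == CLOAK else px for px, bg in zip(frow, brow)]
        for frow, brow in zip(frame, background)
    ]
```

This is also why the demo needs a few seconds of empty-scene footage first: without a stored background there is nothing to reveal behind the cloak.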
Local Desktop LLM Companion (Gemma / Qwen + TTS)
Tagline: “Your own private Jarvis that lives on your laptop, not in the cloud.”
Abstract: A local AI assistant built from small open-weight models (Gemma 2B, Qwen 1.5B, etc.) integrated with a Python GUI (Tkinter or PyQt) and text-to-speech (pyttsx3 or alternatives). The aim is to run chat, code help, and basic tools entirely offline, with a wrapper function to standardize model calls. It’s both a privacy-respecting assistant and a laboratory for experimenting with local inference, caching, and multimodal add-ons.
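The "wrapper function to standardize model calls" can be sketched as a small registry, so the GUI and TTS layers never care which local backend is loaded. The backend here is a stub; real entries would wrap llama.cpp, transformers, or similar runtimes, and the names are illustrative.

```python
BACKENDS = {}

def register(name):
    """Decorator that adds a backend to the registry under a model name."""
    def deco(fn):
        BACKENDS[name] = fn
        return fn
    return deco

@register("echo")
def _echo(prompt, **opts):
    return prompt.upper()        # stub: a real backend runs local inference

def ask(prompt, model="echo", **opts):
    """Single entry point: every caller uses this one signature."""
    if model not in BACKENDS:
        raise ValueError(f"unknown model: {model}")
    return BACKENDS[model](prompt, **opts)
```

Swapping Gemma for Qwen then means registering one new function — nothing in the GUI or speech code changes.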
Carbon Footprint Coach – Everyday Climate App
Tagline: “A tiny coach in your pocket that quietly fights climate change on your behalf.”
Abstract: A conceptual app that lets users log transport, food, energy usage, and purchases and then estimates their carbon footprint using simple models. The twist is AI-driven suggestions tailored to Indian lifestyles: buses vs autos, local foods, and realistic habit swaps. It’s designed for Next.js + an LLM backend (Gemini/others), aiming for minimal friction and high emotional engagement.
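The "simple models" amount to emission factors multiplied by logged quantities. A minimal sketch — the factor numbers below are illustrative placeholders, not vetted coefficients, and a real app would source them from a published Indian emission-factor dataset:

```python
FACTORS = {                 # kg CO2e per unit (illustrative values only)
    "bus_km": 0.05,
    "auto_km": 0.12,
    "electricity_kwh": 0.8,
}

def footprint(log):
    """Estimate kg CO2e from a log mapping activity -> quantity."""
    return sum(FACTORS[a] * q for a, q in log.items())
```

The AI layer then sits on top of this arithmetic, suggesting which log entries are easiest to shrink (bus instead of auto, for example) rather than changing the model itself.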
Real-Time Face Recognition Attendance System (Firebase + Python/OpenCV)
Tagline: “Students check in just by showing up – the wall does the roll call.”
Abstract: A face-recognition pipeline built around Python, OpenCV, and Firebase Realtime Database/Storage. The system stores face encodings for enrolled students, matches live webcam feeds, and writes attendance plus metadata straight into the database. It’s designed as a future-friendly backbone that can later plug into classroom dashboards, security systems, or school ERPs.
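The matching step compares a live encoding against the stored ones by distance and accepts the closest match under a threshold. Sketched with 2-D toy vectors and an assumed threshold; real encodings are 128-dimensional vectors from a face-recognition library, which conventionally uses a similar 0.6 cut-off.

```python
import math

THRESHOLD = 0.6   # assumed acceptance distance

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(live, enrolled):
    """Return the best-matching student_id, or None if nobody is close enough."""
    best_id, best_d = None, float("inf")
    for sid, enc in enrolled.items():
        d = euclidean(live, enc)
        if d < best_d:
            best_id, best_d = sid, d
    return best_id if best_d <= THRESHOLD else None
```

On a match, the pipeline would then write `{student_id, timestamp}` to the Firebase Realtime Database; on `None`, it simply skips the frame.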
Rain-Recharge Radar – Physics-Guided ML for Aquifer Rescue
Tagline: “Turning satellite pixels into groundwater wells, one recharge pit at a time.”
Abstract: A physics-guided machine learning system that identifies the most promising groundwater recharge spots in Guwahati’s urban watersheds. It fuses Sentinel-1/2, HLS, DEM, soil maps, and infiltration logic to compute infiltration potential, concavity, slope, built-up scores, and runoff risk. Output is ward-wise and pixel-wise GeoTIFFs/CSVs that tell planners exactly where a recharge pit or trench gives maximum benefit per rupee. Designed as a student-led, exhibition-ready project aligned with Environmental Engineering and ISEF-style formats.
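The fusion step — turning the stacked layers into a per-pixel suitability index — can be sketched as a weighted combination. The weights and sign choices below are illustrative assumptions standing in for the project's infiltration logic, and layer values are presumed already normalised to 0–1.

```python
def suitability(infiltration, concavity, slope, built_up):
    """One pixel's recharge-suitability index (0 = unsuitable)."""
    # High infiltration and concavity help; steep slope and built-up cover hurt.
    return max(0.0, 0.4 * infiltration + 0.3 * concavity
                     - 0.2 * slope - 0.1 * built_up)

def score_grid(layers):
    """layers: rows of (infiltration, concavity, slope, built_up) tuples,
    standing in for the co-registered raster stack behind the GeoTIFF output."""
    return [[round(suitability(*cell), 3) for cell in row] for row in layers]
```

Aggregating these per-pixel scores by ward boundary then yields the ward-wise CSVs that tell planners where a recharge pit pays off most.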
Low-Cost Classroom AQI Monitor & Wellness Dashboard
Tagline: “Every classroom gets a tiny scientist that quietly watches the air children breathe.”
Abstract: An ESP-based, ultra-low-cost AQI monitor that measures PM1, PM2.5, PM10, CO, and NO₂ in real time for school classrooms. Weekly averages (e.g., an 8-week PM/CO/NO₂ table) are turned into simple traffic-light scores for teachers and parents. The ecosystem vision: dozens of nodes in classrooms, a central dashboard, alerts for unhealthy trends, and policy ideas on ventilation, masks, and activity timing so students can learn in safer air.
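The traffic-light scoring is a simple banding of the weekly averages. The PM2.5 breakpoints below are illustrative, loosely in the spirit of Indian NAAQS-style bands; a deployment should substitute the official limits for each pollutant.

```python
def traffic_light(pm25_weekly_avg):
    """Map a weekly PM2.5 average (µg/m³) to a classroom colour band."""
    if pm25_weekly_avg <= 30:
        return "green"       # comfortable
    if pm25_weekly_avg <= 60:
        return "yellow"      # ventilate, watch the trend
    return "red"             # alert teachers and parents
```

Each ESP node would run one such function per pollutant and report the worst colour, which is what the dashboard and parent alerts display.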
Deep Waste – AI Waste Identification App
Tagline: “Point your camera at trash and watch AI turn chaos into clean streams.”
Abstract: A mobile-first AI app concept where users snap a photo of any waste item and get instant classification: recyclable, organic, e-waste, hazardous, etc. It plugs into the smart bin ecosystem (below), reward systems, and local recycling centers. The project blends computer vision, a curated waste dataset, and simple UX so that even a child can sort complex waste correctly in seconds.
Smart Waste Segregator Bin (SnapSort / Clean Waste MVP)
Tagline: “A dustbin with a brain that refuses to let recyclables die in landfills.”
Abstract: An Arduino/ESP-based prototype bin that uses sensors and/or a connected AI classifier to direct items into the correct compartment. The hardware vision includes IR/proximity, weight sensors, and perhaps a camera linked to the Deep Waste app. The goal: automatic segregation at the point of disposal, with data dashboards for how much plastic, metal, and organic waste a school or locality generates.
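The bin's decision logic is a routing table plus running tallies for the dashboard. A minimal sketch, with an assumed label set standing in for the Deep Waste classifier's output; the real firmware would run this on the ESP after the sensor/camera step.

```python
ROUTES = {"plastic": "recyclable", "metal": "recyclable",
          "food": "organic", "battery": "hazardous"}

class SmartBin:
    def __init__(self):
        self.tally = {}          # compartment -> item count, for the dashboard

    def drop(self, label):
        """Route one classified item and update the tallies."""
        compartment = ROUTES.get(label, "landfill")   # unknown items fall through
        self.tally[compartment] = self.tally.get(compartment, 0) + 1
        return compartment
```

The tallies are exactly the data the abstract's dashboards need: how much plastic, metal, and organic waste a school generates, counted at the point of disposal.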
Smart Tourist Safety Monitoring & Incident Response System
Tagline: “A guardian angel for tourists, powered by AI and a tamper-proof ID.”
Abstract: A concept system that issues digital tourist IDs (potentially blockchain-backed) and ties them to a network of cameras, panic buttons, and an incident-response app. It focuses on rapid verification, location tracing, and streamlined escalation to authorities. This project sits at the intersection of civic tech, safety, and AI-enabled surveillance – ideal for exhibitions or pilot proposals to local administration.
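The "rapid verification" of a tamper-proof ID can be illustrated with an HMAC signature — used here purely as a stand-in for whatever blockchain-backed scheme the concept settles on. The key and payload format are hypothetical.

```python
import hashlib
import hmac

SECRET = b"issuing-authority-key"   # illustrative; would live in secure storage

def issue(payload: str) -> str:
    """Authority signs the tourist ID payload at issuance."""
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}.{sig}"

def verify(token: str) -> bool:
    """Any checkpoint can confirm the ID is genuine and unmodified."""
    payload, _, sig = token.rpartition(".")
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)
```

The point for the exhibition demo: changing even one character of the ID invalidates it, which is the property a tamper-proof credential needs before cameras and panic buttons can trust it.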