
Varden Labs
Varden Labs was founded by me and a friend and engineering classmate at the beginning of 2015, while we were in our second year of mechatronics engineering at the University of Waterloo. What started as a summer project quickly became a company with the intent of commercializing low-speed, driverless transportation shuttles for privately owned campuses such as universities.
We began working on the technology full time in May of 2015. Within a month, we had converted a used golf cart into a vehicle that could follow a fixed, prerecorded route with nobody in the vehicle and slow down or stop for obstacles on the vehicle's intended trajectory. The vehicle was featured by numerous news organizations, including the Discovery Channel and Canadian national news. It also became the first self-driving vehicle on Canadian roads when it drove a fully autonomous lap around the University of Waterloo's "ring road" in July of 2015.
Following a summer of development, we applied and were accepted into the winter 2016 (W16) batch of startups at Y Combinator, a startup incubator in San Francisco. We brought two more of our friends on board and moved to California.
Over the next four months, we improved our technology and built a second-generation vehicle on a more polished electric vehicle platform, which allowed us to scale our prototype vehicles for software testing and development. We also toured many private campuses around California, demonstrating our vehicle in these different environments.
Immediately following the three-month program, we raised a USD 2 million seed round. Around this time, however, we began to notice the untapped market for self-driving commercial semi-trucks and decided to pivot the company in this new direction at the end of April 2016. We renamed the company to Embark and set ourselves to work.
First generation prototype
"Marvin", as we called him, was the first prototype self-driving vehicle that Varden produced. The vehicle was a combined effort between my co-founder and me and was developed in only two months.
Originally a used golf cart, the vehicle was outfitted with custom-made actuators and servos to control both the steering and brake functions; we tapped into the existing system to control the throttle and used an embedded controller to interface with the actuators and their corresponding sensors.
It was also outfitted with a high-end laptop computer running ROS (Robot Operating System) for all high-level software development. The laptop received data from the two main sensors, a NovAtel GPS and a Velodyne VLP-16 Lidar. The GPS unit we used could globally localize the vehicle to within ±2 cm (one standard deviation) under open sky. With this, we were able to record a GPS waypoint path by driving a route once manually and then self-drive down the same path, using the GPS for localization and a geometry-based steering algorithm (pure pursuit) to follow the path. With the Velodyne Lidar, we used a locally framed occupancy grid to determine where objects were relative to the vehicle at each time step. We then overlaid this grid on our path and intended trajectory to determine whether objects were on our intended path and how close we would pass them. This data was used to slow the vehicle if objects were close to the intended path, and to stop it if they were on it. This simple approach, to our surprise, was more than enough to handle 99% of non-adversarial situations at the speeds we were driving (up to 16 km/h).
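As a rough illustration of the approach, below is a minimal Python sketch of the pure pursuit steering step; the waypoint search, frame conventions, and wheelbase value are illustrative assumptions rather than the exact code that ran on Marvin.

    import numpy as np

    def pure_pursuit_steering(path_xy, pose, lookahead=3.0, wheelbase=1.7):
        """Steering angle that tracks a prerecorded waypoint path.

        path_xy:   (N, 2) array of recorded GPS waypoints in a local metric frame
        pose:      (x, y, heading) of the vehicle in that same frame
        lookahead: lookahead distance in metres (tuning parameter)
        wheelbase: axle-to-axle distance in metres (placeholder value)
        """
        x, y, heading = pose

        # Pick the first waypoint at least one lookahead distance away.
        dists = np.hypot(path_xy[:, 0] - x, path_xy[:, 1] - y)
        ahead = np.where(dists >= lookahead)[0]
        target = path_xy[ahead[0]] if len(ahead) else path_xy[-1]

        # Express the target point in the vehicle frame.
        dx, dy = target[0] - x, target[1] - y
        local_x = np.cos(heading) * dx + np.sin(heading) * dy
        local_y = -np.sin(heading) * dx + np.cos(heading) * dy

        # Pure pursuit: steer along the circular arc that passes through the target.
        curvature = 2.0 * local_y / (local_x ** 2 + local_y ** 2)
        return np.arctan(wheelbase * curvature)

The appeal of this kind of controller was its simplicity: it only needs the recorded path and the current GPS pose, which kept the control loop easy to reason about at shuttle speeds.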
Second generation prototype
Our second generation vehicle marked the transition from a summer project to a potentially marketable solution. Like before, the development of this vehicle was primarily a combined effort between me and our third co-founder, who joined after the development of the first vehicle.
For the base vehicle, we chose to retrofit a Polaris GEM electric shuttle, which could drive at speeds up to 35 km/h (roughly the maximum desired speed of our end product) while being small enough to develop on easily. The product line also offered different vehicle lengths, which was ideal for testing and potentially for a first-generation product.
Much of the initial development of this second vehicle followed a similar path to the first. Custom-made actuators were needed so the computer system could control the vehicle; I designed these, and a detailed writeup of them can be found in one of the links at the bottom of this page, along with a more detailed writeup of the computers and embedded system I developed. Generally speaking, a ruggedized in-vehicle computer running ROS was used for all high-level software and, in the first revision, also for the lower-level actuator controls. Once the vehicle was as capable as the first prototype, I began transitioning to a custom-made board with various microprocessors and safety verification, which performed all of the vehicle interfacing.
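To give a sense of the split between the high-level computer and the vehicle interface, below is a bare-bones sketch of a ROS node streaming actuator commands; the topic names, message types, and rate are placeholders for illustration, not the actual interface to our embedded board.

    #!/usr/bin/env python
    import rospy
    from std_msgs.msg import Float64

    if __name__ == '__main__':
        rospy.init_node('vehicle_interface_example')

        # Hypothetical command topics; the real interface carried more state and safety checks.
        steer_pub = rospy.Publisher('/vehicle/steering_cmd', Float64, queue_size=1)
        brake_pub = rospy.Publisher('/vehicle/brake_cmd', Float64, queue_size=1)

        rate = rospy.Rate(50)  # stream at a fixed rate so the embedded side can detect dropouts
        while not rospy.is_shutdown():
            steer_pub.publish(Float64(0.0))  # steering angle command (rad)
            brake_pub.publish(Float64(0.0))  # brake command (fraction of full braking)
            rate.sleep()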
Like the first prototype, we outfitted the vehicle with a single Velodyne VLP-16 Lidar, which had a full view all around the vehicle. Unlike before, however, it was clear that the unit economics didn't work out if each vehicle required a high-end GPS to operate. In our first prototype, vehicle localization was done solely with a high-end NovAtel GPS; not only would this have been too expensive, it was also not a reliable solution for a product intended to operate in non-ideal environments.
At first, localization on this second vehicle was achieved simply with the GPS, but over the course of development, prior to pivoting the company, we were building a localization strategy using the lidar and high-resolution 3D maps. A more detailed writeup of my own work in this area can be found in the links below. The plan was to map campuses with a vehicle equipped with a removable GPS system. The GPS supplemented a basic SLAM (simultaneous localization and mapping) system composed of an ICP (iterative closest point) scan registration process, both between subsequent scans and against a global map, which was used to propagate a particle filter forward in time. The end process would build a high-resolution 3D map of the campus offline, which could then be used with an online particle filter localization system and a cheaper GPS. By the time we pivoted the company, we were able to create 3D maps that were good enough to localize on, but we didn't end up testing our localization online outside of a simulator.
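To make the online half of this concrete, below is a simplified 2D sketch of one particle filter cycle, with a basic nearest-neighbour fitness score standing in for the full 3D scan registration; the function names, noise values, and the 2D simplification are assumptions made for the example, not our actual implementation.

    import numpy as np
    from scipy.spatial import cKDTree

    def scan_match_score(scan_xy, map_tree, pose, inlier_dist=0.3):
        """ICP-style fitness: the fraction of scan points that land within
        inlier_dist of a map point when the scan is placed at pose."""
        x, y, yaw = pose
        c, s = np.cos(yaw), np.sin(yaw)
        world = scan_xy @ np.array([[c, -s], [s, c]]).T + np.array([x, y])
        dists, _ = map_tree.query(world)
        return np.mean(dists < inlier_dist) + 1e-6  # epsilon avoids all-zero weights

    def particle_filter_step(particles, weights, odom_delta, scan_xy, map_tree,
                             motion_noise=(0.05, 0.05, 0.01)):
        """One predict/update/resample cycle of a simplified 2D localizer.

        particles:  (N, 3) array of (x, y, yaw) pose hypotheses
        weights:    (N,) normalized particle weights
        odom_delta: (dx, dy, dyaw) motion since the last scan, in the vehicle frame
        scan_xy:    (M, 2) current lidar scan projected to 2D
        map_tree:   cKDTree built offline from the prerecorded map points
        """
        n = len(particles)

        # Predict: push every particle through the odometry delta plus noise.
        noise = np.random.randn(n, 3) * motion_noise
        c, s = np.cos(particles[:, 2]), np.sin(particles[:, 2])
        particles[:, 0] += c * odom_delta[0] - s * odom_delta[1] + noise[:, 0]
        particles[:, 1] += s * odom_delta[0] + c * odom_delta[1] + noise[:, 1]
        particles[:, 2] += odom_delta[2] + noise[:, 2]

        # Update: weight each particle by how well the scan fits the map from there.
        scores = np.array([scan_match_score(scan_xy, map_tree, p) for p in particles])
        weights = weights * scores
        weights /= weights.sum()

        # Resample when the effective sample size collapses.
        if 1.0 / np.sum(weights ** 2) < n / 2:
            idx = np.random.choice(n, size=n, p=weights)
            particles, weights = particles[idx].copy(), np.full(n, 1.0 / n)

        return particles, weights, particles[np.argmax(weights)]

The map tree would be built once, offline, from the prerecorded map points (for example, cKDTree(map_points_xy)), mirroring the offline mapping and online localization split described above.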
Related Project Links
I was the Engineering Lead of the University of Waterloo's Formula Electric team (FSAE). I led the engineering efforts for the 2018 vehicle, worked toward establishing a new team culture, and picked up any project that needed extra hands.
I tested robot localization algorithms in ROS (Robot Operating System), including particle-filter-based localization that used a Kalman filter on state change and LiDAR-based scan matching, with ICP for prediction and the ICP fitness score for resampling.
As part of a team of four, we are developing an underwater vehicle to traverse an obstacle course in the fastest time possible. My primary role is to develop the GNC (guidance, navigation, and control) stack running on an onboard Android phone.
While learning more about modern control theory, I took on a more challenging project that I had wanted to design for a while. Using Simulink, I built a simulation along with an LQR controller to balance a double pendulum, even when subjected to noise; a sketch of the core gain computation follows below.
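For reference, the gain computation at the heart of an LQR controller is small. The sketch below solves the continuous-time algebraic Riccati equation for a toy system; the matrices are placeholders standing in for the linearized double pendulum model from the Simulink project.

    import numpy as np
    from scipy.linalg import solve_continuous_are

    def lqr_gain(A, B, Q, R):
        """Continuous-time LQR: returns K such that u = -K x minimizes the
        integral of x'Qx + u'Ru subject to x_dot = A x + B u."""
        P = solve_continuous_are(A, B, Q, R)
        return np.linalg.solve(R, B.T @ P)

    # Toy stand-in (a double integrator), NOT the linearized double pendulum:
    A = np.array([[0.0, 1.0], [0.0, 0.0]])
    B = np.array([[0.0], [1.0]])
    K = lqr_gain(A, B, Q=np.eye(2), R=np.eye(1))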