Graduate
Adversarial Active Learning
UAVs in Agriculture
Undergraduate
Design and Manufacturing
Self-Balancing Robot
Autonomous Maze Traversing Robot
RoboGoalie
Select a project to the left to find out more about a specific project I've worked on!
Collaborators: Chris Magnano
Abstract
Active learning allows models to choose their own training instances in an online setting. One unexplored domain for this technique is adversarial learning. Beyond raising a number of new theoretical considerations, using active learning in adversarial domains may give valuable insight into how to design security systems that deter malicious users. Applying active learning in an adversarial setting also changes how a query method should be designed. We evaluate several query methods that may have value in the adversarial problem domain of network intrusion detection, and present a new method, k-Safe Axes. k-Safe Axes simulates a malicious user by attempting to find network attacks that could bypass a network intrusion detector. Our results are inconclusive, but they show the need for further exploration of k-Safe Axes and other active learning query methods in this domain.
Check out the full paper here: Paper
Collaborators: Ganesh Kumar, Margaret Pearce
Abstract
Unmanned Aerial Vehicles (UAVs) are changing workflows in fields ranging from security to cinematography, and agriculture in particular is poised to benefit strongly from this technology. In this paper, we use an ethnographic field study to gauge the current perception of UAVs in the farming community. A Grounded Theory analysis of stakeholder interviews was used to identify UAV trends, impressions, and interest. Our findings revealed relatively low familiarity with UAV technology among farmers but significant interest in its future possibilities. Key concerns included the applicability of UAVs and their potential return on investment. Government regulation is a further critical factor slowing adoption, as stakeholders wait on pending legislative decisions. Despite these concerns, stakeholders are curious to see the future value of this technology. These results contribute to our understanding of the role of UAVs in agricultural settings and inform future design implications.
View our presentation poster here: Poster
Check out the full paper here: Paper
Collaborators: Max Brodsky, Henrietta Cho, Samuel Shrago
This project was a semester-long survey of a complete mechanical design and manufacturing product cycle. The final machine was designed to compete in a ball-collecting competition.
Milestones included initial design sketches, strategy development, and mechanical analysis. After a full design review, engineering drawings and fabrication instructions were developed, and the parts were then manufactured using machining equipment including laser cutters, lathes, and mills.
Check out our final report here: Paper
Collaborators: Samuel Parrotte, Frank Tan, Ryan Wooster
Using computer engineering concepts discussed in lecture and practiced through laboratory exercises, we designed and fabricated a self-balancing robot.
Hardware programming was done in Verilog. Interfaces were created from an Actel SmartFusion embedded SoC to XBee wireless modules, continuous-rotation servos, an accelerometer, a GameCube controller, and an LCD.
Software programming was done in C. Programs were written to take input from both the accelerometer and the GameCube controller and to generate the correct outputs to the servos. The XBee modules provided wireless communication between a base station and the robot, and the LCD served as a diagnostic and feedback display.
To implement effective feedback control, the Ziegler-Nichols tuning method was used to determine the PID parameters for the self-balancing mechanism.
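The tuning procedure can be sketched briefly. In the classic Ziegler-Nichols method, the integral and derivative terms are disabled and the proportional gain is raised until the system oscillates steadily; the gain at that point (the ultimate gain) and the oscillation period then determine the PID constants. The code below is a generic illustration of that rule plus a minimal PID loop, not the actual C firmware from the project; the gain values are hypothetical.

```python
def ziegler_nichols_pid(ku, tu):
    """Classic Ziegler-Nichols rule: ku is the ultimate gain at which the
    closed loop oscillates steadily, tu is the oscillation period (s)."""
    kp = 0.6 * ku
    ki = 1.2 * ku / tu      # equivalently 2 * kp / tu
    kd = 0.075 * ku * tu    # equivalently kp * tu / 8
    return kp, ki, kd

class PID:
    """Minimal PID controller: error in, correction out."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

# Hypothetical experimentally observed values for illustration only.
kp, ki, kd = ziegler_nichols_pid(ku=10.0, tu=0.5)
controller = PID(kp, ki, kd)
```

In the robot's main loop, `error` would be the tilt angle reported by the accelerometer and the controller's output would drive the servos.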
Check out our presentation poster here: Poster
Collaborators: Garrett Dewald, Jim Rasche, Ryan Wooster
This project's goal was to implement the full pipeline required to autonomously navigate a maze. The hardware was a MAEbot, provided by the APRIL lab at Michigan. Computer vision techniques were used to process input from a forward-facing camera: barrel distortion was corrected automatically using a calibration image, and a mapping from the image plane to real-world coordinates was constructed using a perspective-projection homography. Kalman filtering updated the robot's state estimate by fusing odometry from the wheel encoders with data from an inertial measurement unit. Once a map of the observable world was built, path planning was used to explore and navigate.
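The state-estimation step above follows the standard Kalman predict/update cycle. The sketch below is a simplified one-dimensional illustration of that cycle, not the MAEbot code: the state is position and velocity, and noisy position measurements (standing in for odometry) correct the prediction. All numeric values are assumed for illustration; IMU data would enter the same way through additional measurement rows in H.

```python
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity state transition
Q = 0.01 * np.eye(2)                    # process noise covariance
H = np.array([[1.0, 0.0]])              # we observe position only
R = np.array([[0.05]])                  # measurement noise covariance

x = np.zeros((2, 1))                    # initial state [position, velocity]
P = np.eye(2)                           # initial state covariance

def kalman_step(x, P, z):
    # Predict: propagate state and covariance through the motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: correct with the measurement z.
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

for z in [0.1, 0.22, 0.31, 0.40]:       # hypothetical noisy position readings
    x, P = kalman_step(x, P, np.array([[z]]))
```

After each update the covariance P shrinks, reflecting increased confidence in the fused estimate.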
You can find more information about our automatic barrel distortion correction here: YouTube
Collaborators: Aaron Ridenour, Joe Scherping, Ryan Wooster
The purpose of the project was to create a proof-of-concept demonstration that incorporated complex ideas from EECS 467: Autonomous Robotics and from the field at large, while remaining approachable to a lay audience.
Using dynamic object tracking and position prediction, the system accurately moves the robotic arm to intercept the ball's trajectory.
Camera image processing was used to track the ball, and the arm's kinematics were used to move it to the correct blocking position.
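The position-prediction step can be sketched simply. Given two tracked ball positions from consecutive camera frames, a linear extrapolation predicts where the ball will cross the goal line, and the arm is sent to that point. This is an illustrative sketch under a constant-velocity assumption, not the project's actual tracking code; `goal_x` and the sample coordinates are hypothetical.

```python
def predict_intercept(p1, t1, p2, t2, goal_x):
    """Linearly extrapolate the ball's path from two tracked (x, y)
    positions at times t1 < t2, and return the predicted y-coordinate
    where it crosses the goal line x = goal_x (or None if it won't)."""
    (x1, y1), (x2, y2) = p1, p2
    vx = (x2 - x1) / (t2 - t1)
    vy = (y2 - y1) / (t2 - t1)
    if vx == 0:
        return None                 # ball not moving along the goal axis
    t_hit = (goal_x - x2) / vx
    if t_hit < 0:
        return None                 # ball moving away from the goal
    return y2 + vy * t_hit

# Hypothetical frames: ball at (0, 0) then (1, 1), goal line at x = 3.
target_y = predict_intercept((0.0, 0.0), 0.0, (1.0, 1.0), 1.0, goal_x=3.0)
```

The predicted y-coordinate would then be handed to the arm's kinematics as the blocking target.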
Check out our technical video overview here: YouTube