
Yu Zhong
Last updated: 01/25/2015
415-244-7816, [email protected]
http://cs.rochester.edu/u/zyu

Education

University of Rochester, 2011 - Present
Ph.D. student in the Department of Computer Science (Advisor: Jeffrey P. Bigham)
Thesis Proposal: Enhancing Visual Information Access Techniques for Blind Users on Mobile Platforms
Visiting the Human-Computer Interaction Institute (HCII) at Carnegie Mellon University since September 2013

Tsinghua University, Beijing, China, 2009 - 2011 (Top 5%)
M.E. in Computer Science and Info Design Interdisciplinary Program (Advisor: Yuanchun Shi)
Thesis Title: Design and Implementation of Universal Monitor and Controller on Mobile for Smart Home

Tsinghua University, Beijing, China, 2005 - 2009
B.E. in Computer Science and Technology (major)
B.A. in Digital Entertainment Design (minor)

Publications
1. Yu Zhong, Walter S. Lasecki, Erin Brady, Jeffrey P. Bigham. RegionSpeak: Quick Comprehensive Spatial Descriptions of Complex Images for Blind Users. To appear in Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI 2015).
2. Yu Zhong, T.V. Raman, Casey Burkhardt, Fadi Biadsy, Jeffrey P. Bigham. JustSpeak: Enabling Universal Voice Control on Android. In Proceedings of the 2014 International Cross-Disciplinary Conference on Web Accessibility (W4A 2014). ACM, 2014.
3. Walter S. Lasecki¹, Yu Zhong¹, Jeffrey P. Bigham. Increasing the Bandwidth of Crowdsourced Visual Question Answering to Better Support Blind Users. In Proceedings of the 16th ACM Conference on Computers and Accessibility (ASSETS 2014 - Poster).
4. Yu Zhong, Pierre Garrigues, Jeffrey P. Bigham. Real-Time Object Scanning Using a Mobile Phone and Cloud-Based Visual Search Engine. In Proceedings of the 15th ACM Conference on Computers and Accessibility (ASSETS 2013).
5. Walter S. Lasecki, Phyo Thiha, Yu Zhong, Erin Brady, Jeffrey P. Bigham. Answering Visual Questions with Conversational Crowd Assistants. In Proceedings of the 15th ACM Conference on Computers and Accessibility (ASSETS 2013).
6. Erin Brady, Yu Zhong, Meredith Ringel Morris, Jeffrey P. Bigham. Investigating the Appropriateness of Social Network Question Asking as a Resource for Blind Users. In Proceedings of the 16th ACM Conference on Computer Supported Cooperative Work and Social Computing (CSCW 2013).
7. Erin Brady, Meredith Ringel Morris, Yu Zhong, Samuel C. White, Jeffrey P. Bigham. Visual Challenges in the Everyday Lives of Blind People. In Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI 2013).
8. Yu Zhong, Phyo Thiha, Grant He, Walter S. Lasecki, Jeffrey P. Bigham. Using Real-Time Feedback to Improve Visual Question Answering. In Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems, Work-in-Progress (CHI 2012).
9. Yu Zhong, Yue Suo, Wenchang Xu, Chun Yu, Xinwei Guo, Yuhang Zhao, Yuanchun Shi. Smart Home on Smart Phone. In Adjunct Proceedings of ACM UbiComp 2011; demo at UbiComp 2011.
10. Yu Zhong, Xin Li, Mingming Fan, Yuanchun Shi. Doodle Space: Painting on a Public Display by Cam-Phone. In Adjunct Proceedings of ACM Multimedia 2009.
11. Mingming Fan, Xin Li, Yu Zhong, Li Tian, Yuanchun Shi, Hao Wang. Surprise Grabber: A Co-located Tangible Social Game Based on Phone Hand Gestures. In Proceedings of the 14th ACM Conference on Computer Supported Cooperative Work and Social Computing (CSCW 2011).
12. Yue Suo, Chenjun Wu, Yongqiang Qin, Chun Yu, Yu Zhong, Yuanchun Shi. HouseGenie: Universal Monitor and Controller of Networked Devices on Touchscreen Phone in Smart Home. In Proceedings of the 7th International Conference on Ubiquitous Intelligence and Computing (UIC 2010). (Best Demo Award)
13. Yue Suo, Yongqiang Qin, Chenjun Wu, Chun Yu, Yu Zhong. Universal Monitor and Controller of Home Devices. In Proceedings of Harmonious Human Computer Environment 2010. (In Chinese)
14. Yuanchun Shi, Mingming Fan, Chun Yu, Yu Zhong, et al. Painting in Public Doodle Space with Cam-Phone Brush. In Adjunct Proceedings of UbiComp 2009; demo at UbiComp 2009.

¹ Equal authorship, listed in alphabetical order.

Patent
1. Yu Zhong, Pierre Garrigues, Benjamin J. Culpepper. 2014. U.S. Patent 13/768,051: Real-time object scanning using a mobile phone and cloud-based visual search engine. Filed 02/15/13; issued 08/21/14.

Work Experience
Software Engineer, Google
Anticipated start date: May 2015
Anticipated team: Accessibility Engineering, Google Research

Research Intern, Accessibility, Google Research, May 2014 - Aug 2014
Manager: Casey Burkhardt, Software Engineer
Independent project “Close Enough”: Precise Control of Android for Motor-Impaired Users
● Reviewed the literature on prior UI solutions for users with hand tremors.
● Proposed and implemented the three main modules of a new Android app, “Close Enough”:
  ○ Click-on-Lift: ignore unintentional touches and use only the finger-lift point for activation (see the sketch below).
  ○ Enhanced Area Touch: magnify a single touch point to a circle of potential activation targets.
  ○ Disambiguation: confirm via a magnified view when the touch circle encloses multiple targets.
● Ensured above 90% test coverage of the Close Enough implementation.
● Selected (19 of 300+) to present work, including Close Enough, at the Google PhD intern conference.
● Staffed the accessibility sandbox at Google I/O 2014.
● Attended the Google Glass Accessibility workshop 2014.
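The Click-on-Lift idea can be illustrated with a minimal, hypothetical sketch: an Android View.OnTouchListener attached to a container ViewGroup that swallows touch-down and move events and activates whichever child lies under the point where the finger is lifted. The class and helper names below (ClickOnLiftTouchListener, findChildUnder) are illustrative assumptions, not the actual Close Enough code.

    import android.view.MotionEvent;
    import android.view.View;
    import android.view.ViewGroup;

    public class ClickOnLiftTouchListener implements View.OnTouchListener {

        @Override
        public boolean onTouch(View view, MotionEvent event) {
            if (!(view instanceof ViewGroup)) {
                return false;  // only meaningful when attached to a container
            }
            if (event.getActionMasked() != MotionEvent.ACTION_UP) {
                // Swallow DOWN/MOVE so accidental touches and tremor-induced
                // drags never trigger an activation on their own.
                return true;
            }
            View target = findChildUnder((ViewGroup) view, event.getX(), event.getY());
            if (target != null) {
                target.performClick();  // activate only at the finger-lift point
            }
            return true;
        }

        // Returns the first direct child whose bounds contain the lift point.
        private View findChildUnder(ViewGroup parent, float x, float y) {
            for (int i = 0; i < parent.getChildCount(); i++) {
                View child = parent.getChildAt(i);
                if (x >= child.getLeft() && x <= child.getRight()
                        && y >= child.getTop() && y <= child.getBottom()) {
                    return child;
                }
            }
            return null;
        }
    }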
Research Intern, Accessibility, Google Research, May 2013 - Aug 2013
Manager: T.V. Raman, Research Scientist
Independent project “JustSpeak”: Enabling Universal Voice Control on Android
● Created a view indexer on top of the accessibility APIs so JustSpeak can locate on-screen controls by their text (see the sketch below).
● Worked with the Speech team to enable command parsing in JustSpeak.
● Proposed and implemented the “chaining of commands” feature of JustSpeak.
● Tracked down, reproduced, and fixed a series of issues caused by framework bugs.
● Monitored the beta release cycle of JustSpeak and fixed reported usability issues, e.g. word matching.
● Presented JustSpeak at several internal events.
● Published JustSpeak as a communication paper at W4A 2014.
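One plausible shape of such a view indexer, sketched here under the assumption of a standard Android AccessibilityService: recursively walk the AccessibilityNodeInfo tree of the active window and map each clickable node's text or content description to the node, so a spoken phrase can later be matched against these labels. ViewIndexer and its methods are illustrative names, not JustSpeak's actual implementation.

    import android.accessibilityservice.AccessibilityService;
    import android.view.accessibility.AccessibilityNodeInfo;
    import java.util.HashMap;
    import java.util.Map;

    public class ViewIndexer {

        // Collects clickable controls that carry text or a content description.
        public Map<String, AccessibilityNodeInfo> index(AccessibilityService service) {
            Map<String, AccessibilityNodeInfo> labelToNode = new HashMap<>();
            AccessibilityNodeInfo root = service.getRootInActiveWindow();
            if (root != null) {
                collect(root, labelToNode);
            }
            return labelToNode;
        }

        private void collect(AccessibilityNodeInfo node,
                             Map<String, AccessibilityNodeInfo> out) {
            if (node == null) {
                return;
            }
            CharSequence label = node.getText() != null
                    ? node.getText() : node.getContentDescription();
            if (node.isClickable() && label != null && label.length() > 0) {
                out.put(label.toString().toLowerCase(), node);
            }
            for (int i = 0; i < node.getChildCount(); i++) {
                collect(node.getChild(i), out);
            }
        }
    }

A node matched against a recognized command can then be activated with performAction(AccessibilityNodeInfo.ACTION_CLICK).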
Research Assistant, University of Rochester, Aug 2011 - Present
Worked on several accessibility-related research projects within the ROCHCI lab:
● VizWiz: a mobile application that lets blind users receive quick answers to questions about their surroundings. VizWiz combines automatic image processing, anonymous web workers, and members of the user's social network to collect fast and accurate answers.
● CrowdView: a system that enables blind users to ask a sequence of visual questions of the crowd through real-time interaction, streaming the feed from their mobile device's camera.
● ScanSearch: a mobile application that helps blind users take photos without pressing a button, powered by an algorithm that extracts stable, high-quality frames from a continuous video stream (see the sketch below). Increased the success rate of an object recognition app by 30% and reduced average search time by 40%.
● RegionSpeak: an improved question-answering service for blind people that increases Q&A bandwidth on both the end-user and crowd-worker sides. The end-user mobile app lets the user scan the environment and capture a large panoramic picture, which is sent to multiple crowd workers who each label a region of the image in parallel.
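The stable-frame extraction behind ScanSearch is not spelled out above, so the following is only a hypothetical sketch of the general idea: given consecutive grayscale preview frames (e.g. the Y plane of an Android camera preview), forward a frame to the visual search engine only when inter-frame motion is low and a simple sharpness score is high. The class name, measures, and thresholds are illustrative assumptions, not the production algorithm.

    public class StableFrameSelector {

        private static final double MOTION_THRESHOLD = 4.0;     // mean abs. pixel difference
        private static final double SHARPNESS_THRESHOLD = 60.0; // mean gradient energy

        private byte[] previousFrame;

        // Returns true when the frame is steady and sharp enough to send to the
        // cloud-based visual search engine.
        public boolean accept(byte[] frame, int width, int height) {
            boolean steady = previousFrame != null
                    && meanAbsDiff(frame, previousFrame) < MOTION_THRESHOLD;
            boolean sharp = gradientEnergy(frame, width, height) > SHARPNESS_THRESHOLD;
            previousFrame = frame.clone();
            return steady && sharp;
        }

        private static double meanAbsDiff(byte[] a, byte[] b) {
            long sum = 0;
            for (int i = 0; i < a.length; i++) {
                sum += Math.abs((a[i] & 0xFF) - (b[i] & 0xFF));
            }
            return (double) sum / a.length;
        }

        // Average squared horizontal gradient as a crude sharpness measure.
        private static double gradientEnergy(byte[] frame, int width, int height) {
            long sum = 0;
            for (int y = 0; y < height; y++) {
                for (int x = 1; x < width; x++) {
                    int i = y * width + x;
                    int dx = (frame[i] & 0xFF) - (frame[i - 1] & 0xFF);
                    sum += dx * dx;
                }
            }
            return (double) sum / (height * (width - 1));
        }
    }

Gating uploads on steadiness and sharpness is one way to avoid sending blurry frames to the recognition engine.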
Software Engineering Intern, IQ Engines Inc., Berkeley, CA (acquired by Yahoo!), May 2012 - Aug 2012 & Jan 2013 - May 2013
Developed real-time object recognition applications on iOS and Android. Combined IQ Engines' computer vision tools with on-device image processing algorithms to enable real-time scanning of virtually anything with the built-in camera on iPhone and Android phones.

Research Assistant, Pervasive Computing Laboratory, Tsinghua University, Sep 2009 - July 2011
Worked with a team of senior graduate students to design and implement a home appliance control application for mobile phones. As a junior team member, gained experience and training in HCI research.

Software Engineer, iHandysoft Inc., Beijing, China, Oct 2008 - Dec 2009
Worked as the only software engineer on a small startup team building a series of utility iPhone apps. Most of the apps now have more than 100,000 users, and some reached the iTunes top 10, e.g. iHandy Level.

Skills
Java and Android development (expert)
Objective-C and iOS development (expert)
User interface design, study, and analysis (proficient)

Awards
Best Demo Award, Ubiquitous Intelligence and Computing Conference (UIC 2010), 2010
Students' Design Award, NOKIA Research Beijing Workshop, 2010
Third-class award for Graduate Students, Tsinghua University, 2010
Apple Worldwide Developers Conference (WWDC) Student Scholarship, Apple Inc., 2009
Third-class scholarship, Tsinghua University, 2008

Academic Services
Paper Reviewer, CHI WIP, 2015
Paper Reviewer, CHI, 2015
Paper Reviewer, Alt.CHI, 2015
Paper Reviewer, ASSETS, 2014
Student Volunteer, MobileHCI, 2012
Student Volunteer, UIST, 2011