Welcome Everyone!! I am Hasan Shahid Ferdous, a Lecturer in Extended Reality at Deakin University, Burwood Campus, Melbourne, Australia. Previously, I served as a Research Fellow and Associate Lecturer at the Microsoft Centre for SocialNUI and the Interaction Design Lab (IDL), School of Computing and Information Systems, the University of Melbourne, Australia. Currently, I am working on various virtual, augmented, and mixed reality systems for different social contexts.
I completed my PhD at the University of Melbourne, focusing on the social and collaborative use of technologies, particularly in the family mealtime context. I am also interested in touch, eye-gaze, and other NUI interactions in public and private social settings. I completed my Master of IT (Research) degree at Monash University, Australia, and my BSc (Engg.) from the Dept. of CSE, BUET, Bangladesh.
Recent Life Events
CHI 2019 Honourable Mention!! Our paper "Santa's Lil Helper" received a CHI 2019 Honourable Mention Award (top 5% of accepted full papers)! Hurrah...— 15th March, 2019
Two full papers accepted for CHI 2019 - "Augmented Body" and "Santa's Lil Helper!" See you all in Glasgow...— 11th December, 2018
Just presented our work at OzCHI 2018: "Social Media Question Asking (SMQA): Whom Do We Tag and Why?"— 5th December, 2018
“Our workshop proposal has been accepted for CHI 2018. Fingers crossed for visiting Montreal.”— 24th November, 2017
“Yeee, Received the MNSI Best Paper Award for Interdisciplinary research at the Doctoral Colloquium, School of Computing and Information Systems, University of Melbourne.”— 19th July, 2017
Submitted my PhD thesis. All this hard work for the last few years... now fingers crossed :) :)
— 30th May, 2017
Workshop proposal accepted for Interact 2017. Eating Your Own Data: Design At The Intersection Of Quantified Self And Digital Food Rohit Ashok Khot, Josh Andres, Hasan Shahid Ferdous, Jaz Hee-Jeong Choi, and Florian ‘Floyd’ Mueller
— 6th March, 2017
What a wonderful morning! Just received notification that our paper ‘Table Manners’: Children’s Use of Mobile Technologies in Family-friendly Restaurants is accepted as a CHI 2017 Case Study paper.
Wow... Now we have three papers this CHI (ToCHI journal, full paper, case study paper) :) :) :)— 11th February, 2017
Absolutely delighted to have our paper "Celebratory Technology to Orchestrate the Sharing of Devices and Stories during Family Mealtimes" accepted for CHI 2017.
Wow... Now I will present two papers in this CHI :) :) Denver, here I come ...— 12th December, 2016
“Received the Google PhD Travel Scholarship to present my paper at UbiComp 2016 in Heidelberg, Germany.”— 17th October, 2016
“Just received notification that our journal paper "Commensality and the Social Use of Technology during Family Mealtime" got accepted in ACM Transactions on Computer-Human Interaction (TOCHI), one of the top venues for HCI publications.”— 30th August, 2016
“Received the Google Australia Best Paper Award at the Doctoral Colloquium, School of Computing and Information Systems, University of Melbourne.”— 13th July, 2016
“Our long paper titled "TableTalk: Integrating Personal Devices and Content for Commensal Experiences at the Family Dinner Table" got accepted in UbiComp 2016. Germany, here I come :) ”— 5th June, 2016
“Resumed studies after my around the world in 127 days tour :) :) ”— 19th January - 24th May, 2016
“Presented Paper in CHI 2016 Workshop” — 7th May, 2016
“Visiting Research Student @ Microsoft Research, Cambridge. ”— 11 - 15 April, 2016
“Study Away at the Open Lab Newcastle University, UK. ”— 21st January - 27th April, 2016
Everyday interactions with technologies (e.g., TV, mobile phones, etc.) around family mealtimes are often criticised. But can there also be positives?
We worked in collaboration with the Melbourne Science Gallery to investigate the use of "On Body Projections" in public settings.
Chorus transforms personal devices into a communal shared display on the table to enrich mealtime interactions and experiences.
The Augmented Studio platform uses projection mapping to turn a human body into an interactive canvas in real-time.