
Human-System Interaction (HSI) Lab

Texas A&M University College of Engineering


Dr. Zahabi has been awarded the NSF CAREER award to study Adaptive Driver Assistance Systems and Personalized Training for Law Enforcement Officers.

Posted on March 25, 2021 by jpark

(March 2021) Dr. Zahabi has been awarded an NSF CAREER Award to study Adaptive Driver Assistance Systems and Personalized Training for Law Enforcement Officers. For more information regarding this award, please see https://www.nsf.gov/awardsearch/showAward?AWD_ID=2041889&HistoricalAwards=false

Filed Under: Uncategorized

The HSI lab, in collaboration with Wichita State University, has been awarded an NSF grant to develop seamless and inclusive location-based services for communities

Posted on July 11, 2020 by jpark

Award Abstract #1951864

 SCC-IRG Track 2: CityGuide: Seamless and Inclusive Location-Based Services for Communities

NSF Org: CNS (Division of Computer and Network Systems)
Initial Amendment Date: July 10, 2020
Latest Amendment Date: July 10, 2020
Award Number: 1951864
Award Instrument: Standard Grant
Program Manager: Linda Bushnell, CNS Division of Computer and Network Systems, CSE Directorate for Computer & Information Science & Engineering
Start Date: October 1, 2020
End Date: September 30, 2023 (Estimated)
Awarded Amount to Date: $1,122,749.00
Investigator(s): Vinod Namboodiri vinod.namboodiri@wichita.edu (Principal Investigator); Nils Hakansson, Siny Joseph, Maryam Zahabi, and Guler Arsal (Co-Principal Investigators)
Sponsor: Wichita State University, 1845 Fairmount, Wichita, KS 67260-0007, (316) 978-3285
NSF Program(s): S&CC: Smart & Connected Communities
Program Reference Code(s): 042Z, 9150
Program Element Code(s): 033Y

 

ABSTRACT

Persons with disabilities in our communities often find it difficult to achieve and maintain an independent, high-quality life. A significant cause of this issue is the challenge of independently accessing locations and services within the community. In spite of advances in global positioning system (GPS)-enabled applications, there are many daily-life scenarios where the lack of adequate location-based services presents mobility and access challenges to persons with disabilities. The long-term vision for this project is to design, deploy, evaluate, and refine an inclusive community-wide system (accessed through a smartphone app) called CityGuide that provides various auxiliary location-based services (ALBSs) for people with disabilities (and the general population), complementing satellite-based GPS systems. CityGuide will provide a core wayfinding application as a service with the twin capabilities of exploration and navigation. Building upon this core wayfinding service, numerous other applications can be built; some specific applications of interest within the scope of the project are: emergency evacuation, remote assistance, and transit.

This project attempts to use a common technology infrastructure to simultaneously serve the auxiliary wayfinding needs of people with a broad range of disabilities. The top-level goal of using technology to improve accessibility in communities naturally requires integrative socio-technical research contributions that advance knowledge on multiple fronts. From a technology design perspective, the project advances knowledge about providing seamless and scalable indoor and outdoor location-based services. From an information design perspective, the project advances knowledge about contextually appropriate cues and content for a variety of location-based applications. From a human-computer interaction perspective, the project advances knowledge in applying universal design principles towards accessing location-based services. From an economic analysis perspective, the project advances knowledge about the impact of economies of scope and scale on the feasibility and sustainability of accessibility technologies in small to medium-sized communities. Expected societal impacts from the project include the development of wayfinding technologies (and associated tech transfer) that provide people with disabilities, as well as the general population, a useful tool to increase their independence and thus their quality of life, and the creation of a model for other similar future efforts (beyond wayfinding) to address the need for greater inclusivity in how various community-based services are accessed.

This award reflects NSF’s statutory mission and has been deemed worthy of support through evaluation using the Foundation’s intellectual merit and broader impacts review criteria.

 

https://www.nsf.gov/awardsearch/showAward?AWD_ID=1951864&HistoricalAwards=false


HSI lab has been awarded a grant from the Safety through Disruption (Safe-D) University Transportation Center (UTC) to study advanced driver-assistance systems (ADAS) in police vehicles

Posted on May 18, 2020 by jpark

Title

ANALYSIS OF ADVANCED DRIVER-ASSISTANCE SYSTEMS IN POLICE VEHICLES

Abstract

Motor vehicle crashes are the leading cause of death for police officers. These crashes have been mainly attributed to the use of in-vehicle technologies while driving. Advanced driver-assistance systems (ADAS) have the potential to improve officer safety by removing some of the driver's vehicle control responsibilities. Although current ADAS in police vehicles can adapt to emergencies and provide multi-modal alerts, there has been little research on how ADAS can reduce driving task demands in situations where officers are also engaged in secondary tasks while driving. The objective of this project is to evaluate ADAS in police vehicles. The project will investigate how ADAS features should adapt in multi-tasking situations and what types of ADAS are most effective for improving driver safety. The project includes two phases: (1) ADAS needs and implementation analysis in police vehicles; and (2) evaluation of police ADAS in a driving simulation study. The first phase includes ride-along observations and focus group meetings with officers to understand their ADAS needs and the current systems in police vehicles. The second phase will evaluate ADAS in high-demand situations using a high-fidelity driving simulator. Fifty (50) police officers will be recruited through our collaboration with Texas A&M Engineering Extension Services (TEEX). The outcomes will provide practical guidelines to automotive companies supplying police vehicles regarding effective ADAS features/types and can improve officer safety in police operations. This project addresses safety in the primary area of automated vehicles and the secondary application areas of vehicle technology, planning for safety, and driver factors and interfaces.

Research Investigators (PI*)

Maryam Zahabi (TTI/TAMU)*
Farzaneh Shahini (TTI/TAMU-Graduate Student)
David Wonzniak (TTI/TAMU-Student)
Vanessa Nasr (TTI/TAMU-Student)

Project Information

Start Date: 2020-05-01
End Date: 2022-03-31
Status: Active
Grant Number: 69A3551747115
Total Funding: $199,970
Source Organization: Safe-D National UTC
Project Number: TTI-05-02

Safe-D Theme Areas

Connected Vehicles

Safe-D Application Areas

Driver Factors and Interfaces


Sponsor Organization

Office of the Assistant Secretary for Research and Technology
University Transportation Centers Program
Department of Transportation
Washington, DC 20590 United States

Performing Organization

Texas A&M University
Texas A&M Transportation Institute
3135 TAMU
College Station, Texas 77843-3135
USA

 



HSI lab’s collaborative research with TTI and the NeuroErgonomics lab was featured on the Engineering News website and on EurekAlert today

Posted on April 23, 2020 by jpark

Study shows senior drivers prefer watching videos to learn driver assistance technologies

Videos teaching senior drivers how to use advanced driver-assistance systems (ADAS) can help seniors drive more safely and avoid dangerous crashes. | Image: Texas A&M Engineering

Most vehicles today come with their fair share of bells and whistles, ranging from adaptive cruise-control features to back-up cameras. These advanced driver-assistance systems, or ADAS, are in place to make driving easier and safer. However, increasing evidence shows that seniors, an age group at higher risk for motor vehicle crashes, do not use many of these driver-assistance technologies.

In a new study, research partners from the Texas A&M Transportation Institute and Texas A&M University have found that older adults are likely to use ADAS if they are taught how to use these technologies through interactive videos rather than through manuals or live demonstrations. They also reported that once ADAS-trained, older adults find it easier to access and use driver-assistance technologies without compromising their attention on the road.

“Older adults have a higher rate of vehicle crashes because of degradations in physical, mental and motor capabilities,” said Dr. Maryam Zahabi, assistant professor in the Department of Industrial and Systems Engineering and director of the human-system interaction (HSI) laboratory. “With ADAS, some of the mental workload related to driving can be taken off, and we’ve shown that instructional videos are the best way to introduce ADAS to seniors. We hope that this insight will lead to better video-based training materials for this age group so that senior safety while driving is enhanced.”

Their findings were published in the January issue of the journal Applied Ergonomics.

According to the National Highway Traffic Safety Administration, in 2016, 18% of all motor vehicle crashes involved people 65 years and older. With the population of seniors expected to increase in the decades to come, the number of people vulnerable to vehicle crashes is also estimated to increase proportionately.

“Think of the risk for motor crashes as a U-shaped curve,” said Zahabi. “Following the shape of the letter ‘U’, the chances of crashes among younger adults and teens is very high. Then with age, the risk for crashes lowers and remains at a small, relatively constant value until about 60 years, after which it shoots up once again.”

The risk of a vehicle crash among seniors is largely related to the fact that they find it difficult to perform multiple activities while driving, for example, starting the adaptive cruise control while still paying attention to the road and checking the speed limit. While ADAS is designed to relieve some driving-related tasks, these technologies need to be introduced to seniors in a manner that is conducive to learning at their age, said Zahabi.


Ashley Shortz, a graduate student researcher from the NeuroErgonomics Laboratory at Texas A&M, narrowed the options down to four main ways to provide ADAS instruction, based on prior research and existing training best practices: manuals, videos, driving simulators and live demonstrations from an instructor. However, little is known about which of these methods best fits seniors.

“More importantly, while there is substantial evidence that men and women adopt different learning strategies, research on ADAS design and training delivery methods has largely overlooked such gender differences,” said Dr. Ranjana Mehta, associate professor in the Department of Industrial and Systems Engineering and director of the NeuroErgonomics Laboratory.

To address this, the researchers included 10 male and 10 female drivers, ages 58-68, in their study. For this age group, the team concentrated on video-based and demonstration-based ADAS training rather than manuals or driving simulators. Their choice was guided by prior studies showing that drivers don’t read detailed instructions in manuals or have easy access to driving simulators.

After receiving training for either adaptive cruise control or the lane-keeping assist system, which are both popular ADAS technologies, the participants’ driving performance was evaluated in a laboratory-housed driving simulator that provided an immersive experience of driving along a roadway.

Then, while the drivers switched between ADAS and manual control, the researchers kept track of where the drivers directed their gaze and the activity in the part of the brain that regulates attention and mental workload, among other things.

The team found that for both male and female drivers, video-based training was more effective than demonstration-based training for introducing ADAS technologies to seniors. However, the researchers also found some subtle gender differences.

“We were surprised to find that while male drivers were faster at activating ADAS, they were also the most distracted by it,” said Zahabi. “So, from a neurological standpoint, older female drivers were more efficient at using ADAS technologies and reducing their mental workload after video-based training.”

“This finding is important as it not only emphasizes how training methods impact different groups of people, but also provides the foundation to develop more equitable, and thus more effective, training paradigms,” said Mehta.

The researchers noted that more comprehensive studies involving a larger number of older adults, a broader age range of participants and a wider option of driving scenarios still need to be done. They said that these studies might shed light on other gender-based differences that may have not been uncovered in their present study.

But even if preliminary, Zahabi said that their results still indicate why videos work best for teaching ADAS to seniors.


“Videos, we think, are effective because they can be paused, rewound and reviewed multiple times, giving seniors a sense of control over what they are learning and at what pace,” said Zahabi. “Our work does not diminish the importance of manuals and other forms of instructional materials, instead our results challenge the way we normally think about communicating ADAS technology-related information to seniors.”

The results of their work have important real-world implications. “These results and others from the project have already been shared with driver education and training agencies throughout the United States and abroad to aid in the design of curriculum for all ages. This was a great opportunity for work conducted at Texas A&M to impact driver safety,” said Dr. Michael Manser from the Texas A&M Transportation Institute.

Another contributor to the research is Ashiq Mohammed Abdul Razak from the Department of Industrial and Systems Engineering.

Reference:

https://engineering.tamu.edu/news/2020/04/study-shows-senior-drivers-prefer-watching-videos-to-learn-driver-assistance-technologies.html


Dr. Zahabi was featured in Medical Design Briefs as one of the leading women in engineering and science

Posted on March 30, 2020 by jpark

What led you to choose science and/or engineering as a career, particularly in the medical device field?

Zahabi: I always had a passion for understanding human behavior and how to improve the performance of human systems. Therefore, I decided to choose industrial and systems engineering as my major and future career, with a focus on human-systems engineering. I am especially interested in applications of human-systems engineering in the healthcare and rehabilitation domains to enhance the quality of life for specific populations.

What has been your most rewarding moment/accomplishment as an engineer/scientist in the medical field?

Zahabi: The most rewarding moment for me as an instructor is when I see my students applying the knowledge they learned in their courses to make the world a better place for everyone. I am very proud of their success and am grateful to have an impact on their journey.

As a researcher in the health and human systems engineering area, the most rewarding moments are when I see that the outcomes of our research can help people and improve their quality of life. For example, our research on electronic medical records led to a set of design guidelines that improve the usability of these devices, which can ultimately reduce documentation errors and improve patient safety. In addition, our studies of upper-limb prosthetic devices have led to an understanding of the mental workload involved in using these devices and to the identification of algorithms that can reduce the mental workload of prosthetic users and improve their performance in activities of daily living. Seeing these outcomes and their potential impacts is the most rewarding part of my career as a scientist.

What advice would you give to other women looking to work in biomedical engineering and science?

Zahabi: The human (or the user) should be at the center of all designs. Understanding human capabilities and limitations is the first step toward successful human-system interaction. Each individual has different needs and preferences, and the best engineering solutions are those that are customized to match individual user needs. To all women in engineering and science: follow your passion and be confident. Trust what you do and don’t hesitate to ask questions.

Original Post: https://www.medicaldesignbriefs.com/component/content/article/mdb/features/articles/36311?m=854


HSI lab research was featured in KTVA news

Posted on February 7, 2020 by jpark

https://hsi.engr.tamu.edu/wp-content/uploads/sites/200/2020/02/HSI-lab-research-was-featured-in-KTVA-news.mp4

Modern technology has enabled upper limb amputees to keep pace in a fast-paced society. But there are still challenges.

Researchers at three universities are working to make those daily tasks for amputees simpler too — not only physically, but mentally as well.

“To understand the cognitive load, or mental workload when prosthetic users use different types of upper limb prosthetic devices,” said Maryam Zahabi, Ph.D., assistant professor in the Department of Industrial and Systems Engineering at Texas A&M University. “So the issue is that most of these prosthetic devices are actually very hard to use and very challenging.”

The tools used in the research? Machine learning algorithms and computer models. Driving simulations and virtual reality also play a role.

Electromyography, which is also being used in the research, records the electrical activity happening in the muscles. Those signals are in turn deciphered into commands.

The funding for the project comes from the National Science Foundation. Texas A&M University is working with North Carolina State University and the University of Florida.


Texas A&M researchers working to improve prosthetic devices

Posted on February 1, 2020 by jpark

https://www.kbtx.com/content/news/Texas-AM-researchers-working-to-improve-prosthetic-devices-567475171.html?jwsource=cl

COLLEGE STATION, Tex. (KBTX) – Texas A&M is using science and technology to improve the use of prosthetics.

Researchers are using virtual reality as well as a new driving simulator inside the Emerging Technologies Building.

Texas A&M Assistant Professor Maryam Zahabi and her team are studying computer models and machine learning algorithms. The scientists are also using the simulators to study tasks most of us find very simple. They hope to give more guidance to engineers designing new prosthetic devices.

“So the objective of this study is to understand the cognitive load, or mental workload when prosthetic users use different types of upper limb prosthetic device. So the issue is that most of these prosthetic devices are actually very hard to use and very challenging,” said Zahabi, Ph.D.

Their work on improving prosthetics is being funded by the National Science Foundation. Texas A&M is partnering with North Carolina State University and the University of Florida for the study.


3 things you need to know about the new driving simulator

Posted on January 31, 2020 by jpark

The new driving simulator will support vehicle technologies research. | Image: Justin Baetge/Texas A&M Engineering

The Department of Industrial and Systems Engineering at Texas A&M University installed a new driving simulator to use in research pertaining to driving, autonomous vehicles and other vehicle technologies.

Here are the top three things you need to know about the simulator, including what types of research are currently being done and future areas of research that will help increase road safety, such as a future with self-driving vehicles.

Dr. M. Katherine Banks, dean of the College of Engineering, test drives the driving simulator. | Image: Justin Baetge/Texas A&M Engineering
1. A one-of-a-kind feature
The state-of-the-art simulator features a 270-degree field of vision, which provides a realistic driving experience for the user. Field of vision is the area you can see to each side, your peripheral vision, while you look straight ahead. Such a wide field of vision is very rare for a simulator; only a few simulators in the United States have this capability.


The driving simulator can be manually driven or run autonomously. | Image: Dharmesh Patel/Texas A&M Engineering

The driving simulator can also be run autonomously, allowing researchers to conduct experiments about self-driving vehicles and how users react to these vehicles. As self-driving vehicles become more prevalent, this simulator will be helpful to researchers working to understand how self-driving vehicles will impact safety and traffic.

2. Current research projects

The Human Factors and Machine Learning Laboratory is using the simulator for research on autonomous vehicles and cyclist safety, in partnership with researchers in the Department of Landscape Architecture and Department of Psychological and Brain Sciences. This research looks at how bias may play a role in cyclist interactions with vehicles. Realistic driving scenarios were created for the simulator that allowed the researchers to measure the impact of bias on driver and cyclist interactions. This work was funded by the T3 grants awarded through the Office of the President.


Banks (left) speaks to Dr. Maryam Zahabi (center) and Dr. Tony McDonald (right) about their research with the simulator. | Image: Justin Baetge/Texas A&M Engineering

The Human-System Interaction Laboratory uses the simulator for research on emergency responders, specifically law enforcement. The simulator will be used to determine the effects of in-vehicle technologies (such as laptop computers), driver fatigue, and vehicle autonomy on the emergency responder’s performance. These results will then be used to develop in-vehicle systems that adapt to emergency responders and training programs that will improve emergency responder safety on the job. This work is funded by the North Carolina Occupational Safety and Health Education and Research Center.

3. Interdisciplinary teamwork
Many different types of research can be done on the driving simulator, including research on drowsy drivers, autonomous vehicles, on-road sign evaluation, driving education, driver behavior and much more.

The driving simulator is available to researchers interested in these areas of research or others.


Researchers from across the university can collaborate using the driving simulator. | Image: Justin Baetge/Texas A&M Engineering

 

 


New algorithms improve prosthetics for upper limb amputees

Posted on January 15, 2020 by jpark

https://engineering.tamu.edu/news/2020/01/new-algorithms-improve-prosthetics.html


New prosthetic interfaces will be tested through virtual reality and driving simulations. | Image: Dharmesh Patel/Texas A&M Engineering

Reaching for something on the top shelf in the grocery store or brushing one’s teeth before bed are tasks many people can do without thinking. But doing these same tasks as an upper limb amputee, while using a prosthetic device, can require more mental effort.

Dr. Maryam Zahabi, assistant professor in the Department of Industrial and Systems Engineering at Texas A&M University, and her team are studying machine learning algorithms and computational models to provide insight into the mental demand placed on individuals using prosthetics. These models will improve the current interface in these prosthetic devices.

The researchers are studying prosthetics that use an electromyography-based human-machine interface. Electromyography (EMG) is a technique that records the electrical activity in muscles. This muscle activity generates signals that the interface translates into a unique pattern of commands, which allow the user to move their prosthetic device.
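For illustration only, here is a minimal sketch of how such an EMG pipeline can work: simple time-domain features are extracted from a window of the muscle signal and matched against per-command templates. This is not the lab's actual model; the feature choices, command names, and template values are all hypothetical stand-ins.

```python
import numpy as np

def extract_features(window):
    """Two common time-domain EMG features for one analysis window:
    mean absolute value (signal intensity) and zero-crossing count
    (a rough frequency measure)."""
    mav = np.mean(np.abs(window))
    zero_crossings = np.sum(np.diff(np.sign(window)) != 0)
    return np.array([mav, zero_crossings])

def classify(features, templates):
    """Nearest-template classifier: pick the command whose stored
    feature template is closest to the observed features.
    (Real interfaces typically normalize features and use trained
    machine learning models instead.)"""
    return min(templates, key=lambda cmd: np.linalg.norm(features - templates[cmd]))

# Hypothetical per-command templates, as if learned during calibration.
templates = {
    "open_hand": np.array([0.2, 40.0]),   # weak, oscillatory contraction
    "close_hand": np.array([0.8, 0.0]),   # strong, sustained contraction
}

# Synthetic 200-sample window standing in for a real EMG recording:
# a strong, sustained muscle contraction with a little noise.
rng = np.random.default_rng(0)
window = 0.8 + 0.05 * rng.standard_normal(200)

command = classify(extract_features(window), templates)
print(command)  # a sustained high-amplitude signal maps to "close_hand"
```

Part of what makes real devices mentally demanding is that users must produce distinct, repeatable muscle patterns for every command, which is exactly the burden the interface-design research aims to reduce.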

Unfortunately, using such prosthetics can be mentally draining for upper limb amputees – even for accomplishing simple, daily tasks like operating a toothbrush.

“There are over 100,000 people with upper limb amputations in the United States,” Zahabi said.  “Currently there is very little guidance on which features in EMG-based human-machine interfaces are helpful in reducing the cognitive load of patients while performing different tasks.”


Over half of upper limb amputees abandon prosthetic devices due to frustration during use. | Image: Getty Images

Testing different interface prototypes, through virtual reality and driving simulations, will allow researchers to provide guidance to the engineers creating these interfaces. This will lead to better prosthetics for amputees and other technological advances using EMG-based assistive human-machine interfaces.

This research is a collaboration between Texas A&M, North Carolina State University and The University of Florida and is supported by the National Science Foundation.


Welcome to Human-System Interaction (HSI) Lab!

Posted on August 29, 2019 by jpark

We are focused on understanding human behavior and abilities in complex systems, and on using Human-Systems Engineering methods to improve human performance and the design of tools, machines, tasks, and systems, and ultimately to enhance the quality of life!

Please visit our current Projects to find out more about the HSI lab, and see the Opportunities page if you are interested in working with us.

