March 7, 2024 – (Los Angeles, CA)  Inspira AI, a company pioneering new AI cognitive frameworks, today announced the results of a pilot study examining employee sensitivity to data privacy.

As part of the pilot study, two private companies agreed to test Inspira’s AI productivity platform, which includes an AI personality named HARRi who performs certain traditional management duties such as monitoring, coaching and encouraging employees toward optimum performance. The productivity system collects behavioral data in order to perform optimally, so participating employees were asked to sign a privacy agreement.

In the pilot study, 70 employees were presented with a privacy agreement that requested permission to collect sensitive data and included language such as:

“While you are using the Software, we may collect data about you or your work-related activities including:

  • Time that is worked
  • What is worked on
  • Evidence of that work such as file names, typed content or screenshots
  • Behavior such as location, device usage or app usage
  • Productivity data such as typing speed, device interaction or attendance
  • Information that identifies you, such as name, email, phone or address”

The privacy agreement was easy to understand: the data-collection portion was placed near the top, where it was easy to see, and totaled only 211 words.
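To illustrate the scope of that clause, the categories listed above can be read as a single behavioral-data record per employee. The sketch below is purely illustrative, written in Python with hypothetical field names; Inspira has not published its actual data model.

    # Illustrative only: a hypothetical record covering the data categories
    # listed in the pilot's privacy agreement. Field names are assumptions,
    # not Inspira's actual schema.
    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List, Optional

    @dataclass
    class BehavioralDataRecord:
        # Information that identifies the employee
        name: str
        email: str
        phone: Optional[str] = None
        address: Optional[str] = None

        # Time that is worked / what is worked on
        session_start: Optional[datetime] = None
        session_end: Optional[datetime] = None
        task_description: Optional[str] = None

        # Evidence of work: file names, typed content, screenshots
        file_names: List[str] = field(default_factory=list)
        typed_content: Optional[str] = None
        screenshot_paths: List[str] = field(default_factory=list)

        # Behavior: location, device usage, app usage
        location: Optional[str] = None
        apps_used: List[str] = field(default_factory=list)

        # Productivity data: typing speed, device interaction, attendance
        typing_speed_wpm: Optional[float] = None
        attendance_days: int = 0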

In addition, there was a data-sharing clause:

“Data may be shared with Affiliates, but only where your personal information is not put at obvious risk of being made public.”

The employees were neither primed nor coached to accept the agreement, and they were given the opportunity to object and ask questions.

RESULTS: All 70 employees approved the agreement, 68 of them without asking a single question. Before accepting, the other two employees asked who the data would be shared with. They were told “only essential companies closely related to our infrastructure, such as AWS,” and then they, too, accepted the agreement.

Inspira CEO Izzy Traub commented: “There is a lot of hype in public forums related to data privacy. This is partially due to the publication of data from studies conducted in a laboratory environment rather than in the real world. In the laboratory, participants might be presented with hypotheticals such as: ‘Do you feel comfortable sharing personal data with your employer?’ The artificial context of the question creates its own bias. But in the real world, people act differently than they do in surveys.”

On the surface, people say they do not want to share data. Yet when dealing with a trusted entity such as their employer, data sharing may become a trivial issue because of the higher degree of trust between employee and employer. This argument is consistent with other studies confirming that trust in the data collector plays a pivotal role in a user’s willingness to disclose personal data, even among younger users with a higher predisposition toward disclosure (Heirman et al., 2013, “Predicting adolescents’ willingness to disclose personal information to a commercial website: Testing the applicability of a trust-based model”).

Inspira intends to expand this study, but even now it gives a strong indication that the hype around data privacy may be overstated in the media and by influencers who have an agenda to restrict the flow of information.

Traub adds: “If AI is to succeed in making the world a better place, high quantities of data will be required. Our task is to adopt enterprise-class security protocols to keep data safe, employing a variety of accepted measures to protect it, including encryption, firewalls and access control systems. There is no reason to restrict data flow when dealing with trusted entities. Restricting data in this context will only hurt our future economy as AI finds its place in our lives and our society.”
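As one example of the kind of “accepted measure” Traub describes, the sketch below shows symmetric, authenticated encryption of a collected record at rest using the Fernet API from the widely used Python cryptography library. It is a minimal illustration of the general technique under assumed record contents, not a description of Inspira’s actual security stack.

    # Minimal sketch: encrypting a collected record at rest with Fernet
    # (symmetric, authenticated encryption from the `cryptography` package).
    # Illustrates the general measure mentioned above; not Inspira's
    # actual implementation.
    import json
    from cryptography.fernet import Fernet

    # In practice the key would come from a key-management service,
    # not be generated alongside the data.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    record = {"email": "employee@example.com", "typing_speed_wpm": 62.5}

    # Encrypt before writing to storage ...
    token = cipher.encrypt(json.dumps(record).encode("utf-8"))

    # ... and decrypt only after an access-control check passes.
    restored = json.loads(cipher.decrypt(token).decode("utf-8"))
    assert restored == record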

 

About Inspira

Inspira AI Corp is an AI SaaS company pioneering new cognitive frameworks and conversational agents for workforce optimization.


Media Contact: Fly@skypr.co