
ASU brings cybersecurity leaders and students together to talk about the future of data privacy

As our daily interactions become increasingly digitized, how data is collected, stored and used by individuals and organizations has become the center of an ever-evolving discussion about data privacy and digital autonomy.

To recognize Data Privacy Day – which takes place every year on January 28 – ASU’s Enterprise Technology brought over 175 participants together for a virtual, two-hour discussion aimed at educating and engaging the community on the future of data privacy and information security. Moderated by Dr. Donna Kidwell, ASU's chief information security and digital trust officer, a panel of ASU student leaders and cybersecurity experts shared their insights. 

Check out highlights from the discussions, which centered on the importance of individual and institutional accountability in protecting personal data. 

On emerging artificial intelligence and ChatGPT: “Somewhere around early December, many of our high school teachers might have felt that they were experiencing a Christmas miracle as new essays came from incredibly cogent students that were having trouble writing all semester,” Kidwell said. “With the advent of things that could be incredible teaching tools, there are some really interesting privacy implications and opportunities with artificial intelligence.”

Caitlin Fennessy, the chief knowledge officer at the International Association of Privacy Professionals and keynote speaker for the virtual event, then launched into a conversation about the future of data privacy and government oversight.


On the future of U.S. privacy laws and regulations: “We expect even more states to jump into the bill writing game this year and would predict at least a couple more to be adopted,” said Fennessy when speaking on recent privacy regulations passed in California, Colorado, Connecticut, Utah and Virginia. “Across the U.S., one of the things we're watching closely is how both states and organizations are responding to the new requirements to recognize global privacy controls.”
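For readers curious what “recognizing global privacy controls” can look like in practice, here is a minimal sketch in TypeScript. Supporting browsers expose the draft Global Privacy Control (GPC) signal to page scripts as `navigator.globalPrivacyControl` and send servers a `Sec-GPC: 1` request header; the function name and the commented-out tracker step below are hypothetical illustrations, not code from the event.

```typescript
// Minimal sketch of honoring the Global Privacy Control (GPC) signal.
// Per the draft GPC specification, browsers expose the user's opt-out
// preference as `navigator.globalPrivacyControl` (a boolean).

interface GPCSource {
  globalPrivacyControl?: boolean; // true when the user has opted out
}

// Returns true only when an explicit opt-out signal is present;
// a missing or false value is treated as "no opt-out expressed".
function userOptedOutOfSale(nav: GPCSource): boolean {
  return nav.globalPrivacyControl === true;
}

// In a browser you could pass `navigator` directly (cast if your DOM
// typings predate the GPC draft):
//
// if (userOptedOutOfSale(navigator as GPCSource)) {
//   skipThirdPartyTrackers(); // hypothetical helper
// }
```

Under laws like California's, a site receiving this signal would treat it as an opt-out of the sale or sharing of that user's personal data, for example by not loading ad-tech trackers.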

On the future of advertising technology: “Ad tech is truly in for a reckoning this year. Not only because of the European Union's enforcement actions, but more broadly because of the new rules in the U.S. – most notably in the state of California,” said Fennessy. “The requirement to recognize opt-out requests for the sale of data — which clearly includes cross-contextual behavioral advertising under the law — allows people to choose how sensitive data is used. There is a growing focus from the Federal Trade Commission and from the administration on how data is used and tracked across the Internet and whether that use is appropriate.”

On the future of privacy transparency: “I think one of the major takeaways from recent privacy cases is that there is a need for heightened transparency and informed understanding of all the data processing,” said Fennessy. “I think this is important for companies — even outside of the advertising technology space — to be aware of because so many entities put things like behavioral advertising as one part of very long terms of service and bury those provisions.”

Andrew Lukosus, manager of data analysis at the W.P. Carey School of Business and a participant at the event, expanded on data transparency by sharing: “I'd like to see the use of the ‘cookie’ euphemism go away. Instead of a pop-up message asking a user if they will allow cookies, I'd like the message to clearly and succinctly ask the user if they will allow their data to be tracked and sold. A cookie sounds harmless and like something many users probably want, especially kids. Their data being tracked and sold is probably something many users don't want. Essentially, I find the use of the ‘cookie’ term deceitful.”

Kidwell then welcomed five student panelists from across the university for a deep dive into the future of cybersecurity, media ethics and protecting online identities.

On stored social media posts: “I don't think people realize that social media posts — even if you remove them — could be archived somewhere and still be found,” said Christopher Earles, a computer science major focusing on cybersecurity at the Ira A. Fulton Schools of Engineering. “I'm not saying don't share things, but posts could have ramifications if people want to get into a more professional realm. Maybe social media companies should have some type of warning when you post a picture like, ‘hey, this won't ever go away.’”

On predictive generative artificial intelligence and password security: “Recently, I set my grandma up with a password book. She was using a handful of stapled note cards and would organize her passwords by crossing out the old and writing in the new using different handwriting or a different pen color,” shared Kale Bassler, a psychology and criminology major at The College of Liberal Arts and Sciences. “She's also a very conversational person and I worry with chatbots or AI technology, that she might just start having a conversation as if she’s talking with a real person, not assuming that all of this information is being collected and stored in a database somewhere. I worry for older folks who might not realize the information they're saying is going to be stored and could be used outside of that conversation without their consent — potentially giving away critical password information.”

On informing audiences about artificial intelligence: “In the early stages of emerging AI programs, companies should display information that informs users of best practices. For example, users shouldn’t share personal information or purchase histories,” said Nayan Nahar, a mechanical engineering major at the Ira A. Fulton Schools of Engineering.

On cyberbullying and social media ethics: “My peers and I were not very well informed on cyberbullying. My first cyberbullying course was in 8th grade — which is too late because I was on social media before that. Because of that late education, we weren’t able to recognize the signs of cyberbullying when it was happening. I think in social media apps, there should be some sort of safety notification that teaches young people about potentially harmful interactions as we create an account,” said Saumya Lamba, a molecular biosciences and biotechnology major at The College of Liberal Arts and Sciences.

As we move forward in the ever-expanding digital era, the future of cybersecurity and data privacy is intricately tied to our ability to be proactive and innovative. From implementing regulations to fostering transparency, it will take a collaborative effort to ensure the digital landscape grows with our community, enabling us to work, learn and thrive.

For more, follow the conversation at #StaySafeOnline

Written by Kevin Pirehpour