A commercial real estate AI expert said companies should create AI policies based on privacy and security issues and then build on them as needed.
MINNEAPOLIS – At a panel discussion on artificial intelligence use in the workplace, specifically in commercial real estate, Dan Williamson, director of AI for the commercial real estate developer Ryan Companies, said that companies shouldn’t try to solve every AI-related problem in their first policy; they should focus on getting a policy in place.
Williamson said a good starting point for drafting an AI policy is a company’s existing social media policy, but the most important thing, he said, is to “just get something out there.”
“You’re never going to solve all the problems in your first policy,” he said. “The landscape changes so fast the place to start is around security and privacy, those types of things.”
Also on the panel were Garrio Harrison, the chief growth officer at financial adviser Stoneford, and David Nguyen, a professor at the University of Minnesota. The discussion was put on by the Minnesota chapter of Commercial Real Estate Women, or MNCREW.
Williamson said privacy issues arise because information fed to large language models, like OpenAI’s ChatGPT or Google’s Gemini (formerly Bard), becomes accessible to the company that owns the AI. Nguyen said that free models routinely collect your information and turn it into data, and that you should read the terms and conditions for paid models as well.
Companies have said in the past that they wouldn’t track data, only for news to later reveal that they did, Nguyen said, pointing to the example of Amazon using information consumers give to its popular voice assistant, Alexa.
“You have to read the terms and services of these tools, and it’s not that fun to read either, but read them and see what they say that they want to do with your data,” Nguyen said.
When using AI, Nguyen and Williamson emphasized the importance of prompting large language models correctly and with the right data: only use data you can trust, because otherwise you can’t trust what the AI tells you in response.
Harrison said he recommends companies use a “10-80-10” method when working with AI: 10% of your time should be spent thinking about what you want from the machine, 80% should be spent prompting the actual bot, and the final 10% is checking to make sure the results are correct and what you wanted.
Williamson also said that as AI becomes more prominent, he expects more data centers to be built and more energy to be consumed powering them.
A handful of data centers have recently been announced throughout the Twin Cities metro area. Last week, Rosemount became the home of Meta’s upcoming $800 million data center. In early January, a data center in Chaska changed hands for $31 million. CloudHQ has been developing a 1.4 million-square-foot data center, also in Chaska.
Copyright © 2024 BridgeTower Media; and © 2024, Finance & Commerce (Minneapolis, MN). All rights reserved.