As a board member, Joanne Chen knows artificial intelligence better than many of her peers. Chen is a general partner with Foundation Capital, a venture capital firm largely focused on seed-stage investing, with an emphasis on fintech and enterprise. "I invest in the enterprise space with a slant of applying A.I. to the enterprise," says San Francisco-based Chen, who studied computer science at the University of California, Berkeley.

"About 75% of Foundation's enterprise portfolio uses A.I., whether it's machine learning, computer vision, or the buzzed-about large language models," she explains. Among those businesses are Jasper, an A.I. copywriting assistant; and infrastructure players like Anyscale, whose clients include OpenAI, creator of ChatGPT, the popular generative A.I. chatbot.

"There's a lot of opportunity but also a lot of downsides," Chen says, singling out fake content and security problems. "That's why we invest in both the applications of A.I. as well as data security, things that are counterbalancing the downsides of that technology."

Chen sits on more than 10 portfolio company boards. Asked how directors can grasp what role A.I. should play in their organization, along with its potential risks, she emphasizes governance. "The board is really the forum to ask questions, not to have, necessarily, the answers."

With A.I. poised to transform many industries, corporate directors can't ignore this rapidly advancing technology. For boards and their organizations, it offers great promise but also presents daunting challenges. Directors must provide useful oversight of A.I. strategy and ensure that they educate themselves.

Missteps involving A.I. can be costly, as Google parent company Alphabet learned when its market value recently plunged by $100 billion after the underwhelming launch of the Bard chatbot. "If you're in a regulated industry and you're trying to apply A.I., I think you've got to be very, very careful," Chen warns. "Because the consequences of applying A.I. in an incorrect way, where that leads to poor advice or poor results, can be really, really bad."

Most boards could stand to raise their A.I. game. In a recent survey of 500 U.S.-based senior executives by law firm Baker McKenzie, only 41% said they have A.I. expertise at the board level. And even though all respondents agreed that using A.I. comes with risks, just 4% considered them significant.

The first number sounds suspiciously high to Alan Dignam, professor of corporate law at Queen Mary University of London. "I think levels of actual knowledge are pretty poor," says Dignam, who is writing a book about the A.I.-driven organizational transformations that lie ahead.

"I don't know whether that reflects 'I've downloaded ChatGPT and I've downloaded [image generation engine] DALL-E, and I've had a look at it,'" he adds of the 41% figure. "But I'd be really, really surprised if that meant real expertise at board level." Deep knowledge of A.I. is scarce, Dignam notes. "Computer scientists aren't experts in it, because it's not maths, it's not computer science, it's statistics."

Directors' awareness of A.I. has risen over the past several years, says Beena Ammanath, executive director of the Global Deloitte AI Institute. "Every board is now aware they need some level of A.I. savvy and A.I. education, and they need to have an understanding of the risks that come with using A.I."

Board composition has changed as a result, observes Ammanath, who also leads the Trustworthy Tech Ethics practice at Deloitte and is the author of Trustworthy AI. "There's definitely been that slow shift to include a tech executive as part of the board, just to raise the tech IQ," Ammanath says.

Jeanne Kwong Bickford, managing director and senior partner with Boston Consulting Group, sees parallels with cybersecurity a few years ago. "Cybersecurity is the downside risk of digitization, and digitization is a major strategic imperative for a lot of organizations," says Bickford, one of her firm's risk and compliance leaders. "But historically, boards lacked expertise in or knowledge of cybersecurity because it was an emerging risk."

They're now in a similar situation with A.I., Bickford notes. "Originally you think of it as, 'Well, that's really for management to do because it's a very technical topic,'" she says. "But the reality is, A.I. itself is also strategic in nature because it is a disruptive technology that can allow for massive innovation, both on the revenue side but also from a cost and a process side."

In addition to understanding how to use A.I. for competitive advantage, boards must know the downside risks of a sometimes unproven technology, Bickford says. Those risks are also strategic, she explains. "The downside risk of it forces a board to think about corporate social responsibility, the purpose and values of the company itself, because some of the decisions on how you use A.I. will test those values, and the board needs to be able to engage on that."

Ammanath highlights the brand and reputational risks of A.I. To use a timely example, a chatbot might produce misogynistic content. For Ammanath, it's a matter of "understanding the places within the organization that A.I. is being used and what kind of ethical tech checks have been done, and getting the best practices from the rest of the industry or from other organizations."

Because all A.I. should be responsible, the organization must first set out its responsible A.I. policy, Bickford asserts. "What are those principles that guide how a firm will or will not use A.I.?" she asks. "We really do believe that responsible A.I. can't be separate from normal risk management and good governance."

Regulation is another key factor for Bickford, who points to the European Union's proposed AI Act. "And there are various national, regional, state, and local regulations that are coming up, [so] that this also becomes a fiduciary responsibility from just a legal and regulatory perspective. So all of those things force the board to have to engage on A.I."

For any company that plans to roll out an A.I. strategy, Chen reels off a list of questions that directors should ask management. "What are the goals for the strategy? How is this impacting top line, bottom line? What are the resources we need?" she says. "In the case of unintended consequences, what are the guardrails that you guys are enforcing in advance? How are competitors using this technology?"

Dignam's advice: Be skeptical about sales pitches for A.I. products. "[Ask] really, really detailed questions about what this thing was tested on," he counsels. "There's an awful lot of mis-selling; there's an awful lot of misunderstanding by boards as to what they're buying."

In Dignam's experience, organizations seeking to harness A.I. often misread how dramatically they must change. "The No. 1 thing that a board needs to understand [is] that if you've got the type of business that could take advantage of A.I., then really, you are talking about transforming your business into one primarily designed around high-quality internal data generation, which you can then use A.I. with to help you analyze patterns."

Businesses going that route won't need to resemble today's companies, predicts Dignam, who thinks many executive roles will become superfluous. "The board will move closer to production; it'll move closer to the product. Shareholders will move closer to the product as well," he says. "I'm not sure that in 10 years, we'll need boards in the way that we use them now for certain industries."

In health care and other regulated industries, companies are usually conservative about deploying A.I., Chen relates. "Maybe they're applying it to support functions versus the core product first," she says, "and then thinking about what guardrails they can bring in."

For a board to talk capably about strategic direction when it comes to A.I., not every director needs to be facile in the technology, Bickford reckons. "You would hope or aspire to have at least one member that has enough of that depth to be able to help guide the rest of the board in those conversations around the possibilities and opportunities of A.I."

Boards could also have an A.I. expert give a presentation or hold regular advisory meetings, Chen suggests. The trick, though, is to find someone who understands how to commercialize a product using A.I., as well as someone who understands A.I. sufficiently to see the impact.

What's the best way for directors to get up to speed on artificial intelligence? Universities offer A.I. fluency training, and many companies have in-house programs, Ammanath says. But "A.I. for board members is a training that I think should be mandated," she says, recommending that it become part of their certification.

And besides regulation, what trends should boards be watching?

Generative A.I. has changed the game by democratizing access, Bickford says. "Previously, I think, in a board or even management, the development and deployment of A.I. was actually quite controlled," she explains. "With generative, it's embedded in products you buy, so your third-party vendors, you can buy it from them. You might not fully understand or vet how it can or can't be used, or the quality of what you just purchased." Meanwhile, an employee might decide to use a generative app in their work.

For boards, all that access creates a new urgency because the normal governance processes don't apply so well, Bickford explains. "You have a lot of shadow A.I. that exists in an organization, which I think also can push a board to actually pay attention and also help set what the guidelines are."

Ammanath expects A.I. and other tech to get more attention at the committee level. "I wouldn't be surprised in the future if we have some form of technology committee or subcommittee," she says.

Dignam has a somewhat different take: "If [directors are] serious about looking at it, they need a subcommittee of the board to look at organizational transformation around high-quality data and utilizing A.I."

More advanced boards have started to consider new technologies that are blending with artificial intelligence, Ammanath says, citing the metaverse. "It's important [for directors] to also look beyond A.I. at other emerging technologies and educate themselves, and know how to assess and govern it as a board member."

