What AI Means for Research

Last Updated 01/12/2024

We have been inundated recently with messaging about the dangers and risks that artificial intelligence (AI) poses to educators, researchers, and an informed public. A steady stream of editorials and opinion pieces warns us about the pernicious effects of ChatGPT and other AI applications. But how much of that messaging is fearmongering? And can we really afford not to explore AI’s obvious potential?

Those of us involved with the Learn & Work Ecosystem Library are well aware of this tension. We launched the Library in December 2022 to provide an online, open-source collection of the latest research and resources related to education, training, and work. No such collection would be complete—or even useful—if it ignored AI.

We know that AI tools are already being embraced and monitored in the learning-and-work ecosystem. We all share a responsibility to assess the utility, and the challenges, of this powerful technology for identifying and curating information.

Yet educators and education officials at all levels are concerned that students could be misusing AI or over-relying on it for their learning. Many educators worry that these tools could erode education quality.

One example: In early November 2023, Anthology, an education technology company, released the results of a global research study, Comparing Global University Mindsets and Student Expectations. The study, which surveyed more than 5,000 students and university leaders from 11 countries, found that nearly one in four U.S. students is unfamiliar with or does not use generative AI writing tools, while 38% use them frequently. More than half of U.S. students, however, expect to increase their use of such tools within six months.

When it comes to university leaders, only 26% report using AI tools frequently or occasionally. University leaders are not particularly bullish on the use of the tools and are worried about their implications for plagiarism and inequity. Despite this, 45% of university leaders say they plan to increase their use of AI tools in the next six months and indicate optimism that the tools can help support course operations.

The big concern—and a major shortfall of AI as an information-dissemination tool—is its dearth of citations. Too often, users have no idea where the information has come from. Multiple instances of AI tools generating inaccurate information have already been reported. While chatbots display impressive speed and agility in ideation and brainstorming, the answers they generate often lack valuable context, framing, and accuracy. In short, AI is not yet a reliable research method; it is plagued by serious gaps.

The Learn & Work Ecosystem Library is working to fill those gaps. As Matthew Valdez, its librarian, recently outlined in an interview, one of the Library’s greatest strengths is that it contextualizes information by providing attributions and citations. Rather than being at the mercy of a bot—which tries hard to provide the answer it thinks a user wants—users of the Library’s search tools are empowered to explore and learn more about various topics and initiatives.

Valdez cited other features of the Library that increase its value for users: It provides active links to external sources, so users can confirm their validity and note the dates on which the information was published. It also links users to additional resources that they can use to explore other avenues of inquiry. In a field as diverse and dynamic as education and workforce development, the possibilities for inquiry are endless.

Another important difference between AI tools and the Library is the latter’s ability to effectively “de-silo,” or aggregate, information from multiple sources. That’s a vital feature, given the siloed nature of information in the field of education and workforce development. Though the field itself is highly interconnected, much of its related research is handled discretely, with many important studies tucked away behind paywalls, hosted on different platforms, or presented without context about their place in the ecosystem. AI tools source information by pulling from specific knowledge bases, and this can actually lead to greater isolation or siloing. Because the Library presents topics and initiatives in relational groups, it can be a great equalizer in providing relevant and informative content.

That said, the Library is working with a technology partner to improve the website’s organization by incorporating AI. The vision is that AI will help the Library link entries together by category and topic, so that users can navigate the website more easily.

To dig in further, the Library will use AI in 2024 to develop relational maps. The maps, which will complement the arrangement of searchable components, will show how specific initiatives are related to various organizations, glossary terms, topics, and other initiatives in the Library. The Library will be able to continuously train and refine the AI model.

Though AI has many critics, the Library is ready to leverage it as a tool, and to try to fill in some gaps as AI strives for greater reliability. In the meantime, the Library can serve as a valuable research tool that avoids the worst pitfalls of AI while filling a void in the information space.
