At the 2026 EKBN Forum, our Managing Director Trish Martin led a session called "Making Sense of AI Together." The focus was not the latest AI tools or a roadmap for adoption. Instead, it explored something many institutions are realizing is harder than the technology itself: making sense of AI together, fast enough and responsibly enough to act.
Over the past year, Monstarlab has worked closely with higher education institutions navigating this shift through product design partnerships, digital modernization efforts, and a series of national roundtables we convened in Fall 2025. These conversations brought together leaders from universities and community colleges to discuss how they are approaching AI in real time.
The tone was pragmatic. Participants were hopeful about the possibilities but clear-eyed about the realities of policy, trust, and institutional capacity. Across these conversations, four patterns consistently emerged.
1. Most institutions are organizing, not operationalizing
AI task forces are everywhere. Nearly every institution in our roundtables had created a committee or cross-campus initiative focused on AI.
What remains rare is systemic implementation. Much of the work is exploratory and distributed. As one participant described it, AI efforts often represent “one percent of a hundred people’s jobs.”
Task forces create permission to explore. But without shared understanding of what AI should enable, they rarely generate sustained momentum.
2. Equity and ethics are everyday operational questions
In public discourse, AI ethics often sound abstract. On campuses, they appear as operational constraints.
Innovation is not slowing because institutions lack imagination. It is slowing because the guardrails are still forming.
One of the most important shifts we heard was around assessment. Several institutions are moving away from detecting AI use and toward designing assignments that assume it.
Students may use AI, but they must show how. Prompts, iterations, reflections, and oral defenses make thinking visible again.
When the question shifts from “How do we stop this?” to “What do we want students to demonstrate?” the conversation moves forward.
3. Value is emerging through small, unglamorous wins
Despite the attention AI receives in headlines, the most meaningful progress we see is happening in modest, practical ways.
One institution described using a custom AI model to review course structure and engagement data. What previously required 16 to 18 hours of manual review per course can now be completed in 2 to 4 hours with human oversight.
Others are experimenting with scheduling agents, advising dashboards, and AI-assisted feedback tools that reduce faculty workload while improving student support.
None of these examples are flashy. But they are visible, measurable, and credible: qualities that matter far more than ambition when institutions decide whether to move forward.
Small wins create trust. Trust creates momentum.
4. The real bottleneck is shared understanding
The most consistent barrier we heard about was not technological. It was relational.
Faculty are experimenting with new tools, often independently of one another. Without shared language and understanding, that innovation fragments. This is where knowledge brokering becomes essential.
Knowledge brokers do not necessarily provide answers. They make emerging patterns legible. They surface assumptions, translate risk into design constraints, and carry learning across organizational boundaries.
In a moment moving this quickly, that role is becoming infrastructure.
What these patterns signal for higher education leaders
These conversations suggest that the institutions moving fastest are not those adopting the most technology. They are the ones building shared understanding across roles. This is where thoughtful digital product design and systems thinking become critical.
A question that emerged at EKBN
During the EKBN discussion, another important theme surfaced: a growing gap between higher education and industry when it comes to AI expectations.
Higher education is not only interpreting a rapidly changing technology. It is also interpreting a rapidly evolving labor market.
Three practices institutions can start today
Across the roundtables and the EKBN discussion, a few practical steps emerged that institutions can adopt without waiting for perfect clarity.
1. Create a shared language for AI use.
Instead of focusing on specific tools, define levels of acceptable use within learning contexts. For example: open use, guided use, or prohibited use. Clear categories reduce confusion for both students and faculty.
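For institutions that want to make such categories machine-readable (for example, in an LMS integration or syllabus generator), the idea can be sketched in a few lines of code. This is an illustrative sketch only, not a recommended implementation; the category names follow the examples above, and the assignment types and function names are hypothetical.

```python
# Illustrative sketch: encoding AI-use categories per assignment type.
# Category names mirror the examples in the text; everything else is hypothetical.
from enum import Enum

class AIUse(Enum):
    OPEN = "open use"              # students may use AI freely, with disclosure
    GUIDED = "guided use"          # AI permitted for specified steps only
    PROHIBITED = "prohibited use"  # no AI assistance allowed

# A simple course-level policy map keyed by assignment type.
policy = {
    "research_brainstorm": AIUse.OPEN,
    "reflection_essay": AIUse.GUIDED,
    "oral_defense": AIUse.PROHIBITED,
}

def policy_label(assignment: str) -> str:
    """Return the human-readable AI-use label for an assignment."""
    return policy[assignment].value

print(policy_label("reflection_essay"))
```

The point is not the code itself but the constraint it enforces: every assignment gets exactly one clearly named category, so students and faculty are never left guessing.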
2. Establish repeatable cross-role dialogue.
AI touches teaching, operations, legal policy, and technology infrastructure. Institutions benefit from regular forums where faculty, IT, leadership, and policy experts can discuss emerging issues together rather than in isolation.
3. Maintain a visible artifact of learning.
Document what is being tested, what is working, and what is changing. A living playbook or knowledge repository allows insights from small pilots to inform broader institutional strategy.
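One lightweight way to structure such a playbook is to give every pilot a consistent record with a status and a running list of lessons. The sketch below is a hypothetical format, assuming made-up pilot names and fields, just to show the shape such a repository might take.

```python
# Illustrative sketch: a minimal "living playbook" entry for AI pilots.
# Pilot names, statuses, and field names here are hypothetical examples.
from dataclasses import dataclass, field
from typing import List

@dataclass
class PilotEntry:
    name: str
    status: str                      # e.g. "testing", "working", "paused"
    owner: str                       # unit responsible for the pilot
    lessons: List[str] = field(default_factory=list)

playbook: List[PilotEntry] = [
    PilotEntry("AI-assisted course review", "working", "Teaching & Learning",
               lessons=["Human oversight still required for edge cases"]),
    PilotEntry("Scheduling agent", "testing", "Registrar"),
]

def working_pilots(entries: List[PilotEntry]) -> List[str]:
    """Names of pilots with visible, repeatable results."""
    return [e.name for e in entries if e.status == "working"]

print(working_pilots(playbook))
```

Whether this lives in code, a spreadsheet, or a shared document matters less than the discipline of keeping it current and visible across roles.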
Moving forward together
Higher education is no longer deciding whether AI matters. That question has already been answered. The real question now is whether institutions can build shared understanding quickly enough to guide responsible action.
What stood out most in our conversations was not fear or resistance. It was thoughtful experimentation. Educators and leaders are exploring, questioning, and adapting in ways that reflect the complexity of their mission.
At Monstarlab, we believe this moment calls for collaboration between educators, technologists, and designers. Institutions do not need to navigate these shifts alone. The most effective strategies emerge when leaders combine expertise in pedagogy, digital systems, and human-centered design.
If the sector continues to center dialogue, transparency, and collaboration, the path from experimentation to impact will become clearer for everyone. And perhaps most importantly, it will remain a collective effort.
Continuing the conversation
Monstarlab works with higher education institutions to design digital products, modernize learning systems, and explore responsible AI applications that support teaching, operations, and student success.
If you’re interested in discussing how these patterns are showing up on your campus, we’d love to continue the conversation.
", "publishedDate": 1773183600000, "relatedBlogs": {}, "seo": { "description": "At the 2026 EKBN Forum, our Managing Director Trish Martin led a session called Making Sense of AI Together. " }, "site": "americas", "slug": "making-sense-of-ai-together-what-higher-education-is-learning-in-real-time", "thumbnailImage": "https://cdn.builder.io/api/v1/image/assets%2Ffb3ccc876dd442c6ae31d776377e35db%2F97e57af360d647fc924f223c66b190a6", "timeToRead": "4 mins", "title": "Making Sense of AI Together: What Higher Education Is Learning in Real Time" }Copyright © 2006-2026 Monstarlab All Rights Reserved.