Looking Back on 2025 (and Ahead to 2026)

Happy New Year 2026! I honestly cannot believe another year has already passed. Looking back, 2025 feels like it went by in a blur of late nights, deadlines, competitions, and moments that quietly changed how I think about learning. This blog became my way of slowing things down. Each post captured something I was wrestling with at the time, whether it was research, language, or figuring out what comes next after high school. As I look back on what I wrote in 2025 and look ahead to 2026, this post is both a reflection and a reset.

That sense of reflection shaped how I wrote this year. Many of my early posts grew out of moments where I wished someone had explained a process more clearly when I was starting out.

Personal Growth and Practical Guides

Some of my 2025 writing focused on making opportunities feel more accessible. I wrote about publishing STEM research as a high school student and tried to break down the parts that felt intimidating at first, like where to submit and what “reputable” actually means in practice.

I also shared recommendations for summer programs and activities in computational linguistics, pulling from what I applied to, what I learned, and what I wish I had known earlier. Writing these posts helped me realize how much “figuring it out” is part of the process.

As I got more comfortable sharing advice, my posts started to shift outward. Instead of only focusing on how to get into research, I began asking bigger questions about how language technology shows up in real life.

Research and Real-World Application

In the first few months of the year, I stepped back from posting as school, VEX Robotics World Championship, and research demanded more of my attention. When I came back, one of the posts that felt most meaningful to write was Back From Hibernation. In it, I reflected on how sustained effort turned into a tangible outcome: a co-authored paper accepted to a NAACL 2025 workshop.

Working with my co-author and mentor, Sidney Wong, taught me a lot about the research process, especially how to respond thoughtfully to committee feedback and refine a paper through a careful round of revision. More than anything, that experience showed me what academic research looks like beyond the initial idea. It is iterative, collaborative, and grounded in clarity.

Later posts explored the intersection of language technology and society. I wrote about AI resume scanners and the ethical tensions they raise, especially when automation meets human judgment. I also reflected on applications of NLP in recommender systems after following work presented at RecSys 2025, which expanded my view of where computational linguistics appears beyond the examples people usually cite.

Another recurring thread was how students, especially high school students, can connect with professors for research. Writing about that made me more intentional about how I approach academic communities, not just as someone trying to get a yes, but as someone who genuinely wants to learn.

Those topics were not abstract for me. In 2025, I also got to apply these ideas through Student Echo, my nonprofit focused on listening to student voices at scale.

Student Echo and Hearing What Students Mean

Two of the most meaningful posts I wrote this year were about Student Echo projects where we used large language models to help educators understand open-ended survey responses.

In Using LLMs to Hear What Students Are Really Saying, I shared how I led a Student Echo collaboration with the Lake Washington School District, supported by district leadership and my principal, to extract insights from comments that are often overlooked because they are difficult to analyze at scale. The goal was simple but ambitious: use language models to surface what students care about, where they are struggling, and what they wish could be different.

In AI-Driven Insights from the Class of 2025 Senior Exit Survey, I wrote about collaborating with Redmond High School to analyze responses from the senior exit survey. What stood out to me was how practical the insights became once open-ended text was treated seriously, from clearer graduation task organization to more targeted counselor support.

Writing these posts helped me connect abstract AI ideas to something grounded and real. When used responsibly, these tools can help educators listen to students more clearly.

Not all of my learning in 2025 happened through writing or research, though. Some of the most intense lessons happened in the loudest places possible.

Robotics and Real-World Teamwork

A major part of my year was VEX Robotics. In my VEX Worlds 2025 recap, I wrote about what it felt like to compete globally with my team, Ex Machina, after winning our state championship. The experience forced me to take teamwork seriously in a way that is hard to replicate anywhere else. Design matters, but communication and adaptability matter just as much.

In another post, I reflected on gearing up for VEX Worlds 2026 in St. Louis. That one felt more reflective, not just because of the competition ahead, but because it made me think about what it means to stay committed to a team while everything else in life is changing quickly.

Experiences like VEX pushed me to think beyond my own projects. That curiosity carried into academic spaces as well.

Conferences and Big Ideas

Attending SCiL 2025 was my first real academic conference, and writing about it helped me process how different it felt from school assignments. I also reflected on changes to arXiv policy and what they might mean for openness in research. These posts marked a shift from learning content to thinking about how research itself is structured and shared.

Looking across these posts now, from robotics competitions to survey analytics to research reflections, patterns start to emerge.

Themes That Defined My Year

Across everything I wrote in 2025, a few ideas kept resurfacing:

  • A consistent interest in how language and AI intersect in the real world
  • A desire to make complex paths feel more navigable for other students
  • A growing appreciation for the human side of technical work, including context, trust, and listening

2025 taught me as much outside the classroom as inside it. This blog became a record of that learning.

Looking Toward 2026

As 2026 begins, I see this blog less as a record of accomplishments and more as a space for continued exploration. I am heading into the next phase of my education with more questions than answers, and I am okay with that. I want to keep writing about what I am learning, where I struggle, and how ideas from language, AI, and engineering connect in unexpected ways. If 2025 was about discovering what I care about, then 2026 is about going deeper, staying curious, and building with intention.

Thanks for reading along so far. I am excited to see where this next year leads.

— Andrew


Humanoid Robot Forum 2025: Where Industrial Innovation Takes Center Stage

If you’re as interested in the future of robotics as I am, here’s an event you’ll want to keep an eye on. The Humanoid Robot Forum 2025 is happening on September 23, 2025, in Seattle (my city), Washington. Organized by the Association for Advancing Automation (A3), this one-day event brings together experts from the robotics and AI industries to explore how humanoid robots are being developed and deployed in real-world settings.

What makes this event exciting to me is that it focuses not just on hardware, but also on how technologies like AI and simulation are shaping the next generation of human-like robots. One of the keynotes I’m especially looking forward to is from Amit Goel, Head of Robotics Ecosystem at NVIDIA. His talk, “Advancing Humanoid Robotics Through Generative AI and Simulation,” will dive into how generative AI can help design, train, and test robot behaviors in simulated environments before deploying them in the real world. As someone who’s been exploring AI and NLP through my own projects, this intersection of AI and robotics is something I’m eager to learn more about.

The full agenda includes sessions and speakers from:

  • Diligent
  • Apptronik
  • Agility Robotics
  • PSYONIC
  • GXO
  • Association for Advancing Automation (A3)
  • Boston Dynamics
  • UCSD Advanced Robotics and Controls Lab
  • WiBotic
  • Cobot
  • NVIDIA
  • Cambridge Consultants
  • Toyota Research Institute
  • Sanctuary AI
  • True Ventures

Topics will include scaling up robotic hardware, AI-driven perception and control, power management, investment trends, and more. For anyone curious about how humanoid robots might start appearing in warehouses, hospitals, or even homes, this forum gives a front-row seat to what’s happening in the field.

Even though I won’t be attending in person (I’ve got school, college apps, and robotics season keeping me busy), I’ll definitely be keeping an eye out for takeaways and speaker highlights.

You can check out the full agenda and register for the event here:
👉 Humanoid Robot Forum 2025

— Andrew

How NLP Helps Robots Handle Interruptions: A Summary of JHU Research

I recently came across an awesome study from Johns Hopkins University describing how computational linguistics and NLP can make robots better conversational partners by teaching them how to handle interruptions, a feature that feels basic for humans but is surprisingly hard for machines.


What the Study Found

Researchers trained a social robot powered by a large language model (LLM) to manage real-time interruptions based on speaker intent. They categorized interruptions into four types: Agreement, Assistance, Clarification, and Disruption.

By analyzing human conversations from interviews to informal discussions, they designed strategies tailored to each interruption type. For example:

  • If someone agrees or helps, the robot pauses, nods, and resumes speaking.
  • When someone asks for clarification, the robot explains and continues.
  • For disruptive interruptions, the robot can either hold the floor to summarize its remaining points before yielding to the human user, or it can stop talking immediately.

How NLP Powers This System

The robot uses an LLM to:

  1. Detect overlapping speech
  2. Classify the interrupter’s intent
  3. Select the appropriate response strategy

In tests involving tasks and conversations, the system correctly interpreted interruptions about 89% of the time and responded appropriately 93.7% of the time.
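The three steps above can be sketched in a few lines of Python. This is a minimal illustration of the pipeline's shape, not the researchers' system: the interruption categories and response strategies come from the study, but the keyword rules below are a toy stand-in for the LLM classifier, and all function names are my own.

```python
# Toy sketch of the detect -> classify -> respond pipeline from the JHU study.
# The keyword rules are a hypothetical stand-in for LLM intent classification.

STRATEGIES = {
    "agreement": "pause, acknowledge, and resume",
    "assistance": "pause, acknowledge, and resume",
    "clarification": "explain, then continue",
    "disruption": "summarize remaining points or stop",
}

def detect_overlap(robot_speaking: bool, user_speaking: bool) -> bool:
    """Step 1: an interruption only exists if both parties speak at once."""
    return robot_speaking and user_speaking

def classify_intent(utterance: str) -> str:
    """Step 2: toy keyword rules standing in for the LLM classifier."""
    text = utterance.lower()
    if any(w in text for w in ("yes", "exactly", "right")):
        return "agreement"
    if any(w in text for w in ("you mean", "what is", "?")):
        return "clarification"
    if any(w in text for w in ("let me help", "try this")):
        return "assistance"
    return "disruption"

def respond(robot_speaking: bool, utterance: str) -> str:
    """Step 3: map the classified intent to a response strategy."""
    if not detect_overlap(robot_speaking, user_speaking=bool(utterance)):
        return "no interruption"
    return STRATEGIES[classify_intent(utterance)]

print(respond(True, "Wait, what is a transformer?"))  # prints "explain, then continue"
```

In the real system, of course, step 2 is where the LLM earns its keep: intent depends on tone, timing, and context, not just keywords.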


Why This Matters in NLP and Computational Linguistics

This work highlights how computational linguistics and NLP are essential to human-robot interaction.

  • NLP does more than generate responses; it helps robots understand nuance, context, and intent.
  • Developing systems like this requires understanding pause cues, intonation, and conversational flow, all core to computational linguistics.
  • It shows how multimodal AI, combining language with behavior, can enable more natural and effective interactions.

What I Found Most Interesting

The researchers noted that users didn’t like when the robot “held the floor” too long during disruptive interruptions. It reminded me how much pragmatic context matters. Just as people follow unspoken rules in conversation, robots need to learn those conversational norms too.


Looking Ahead

This research expands what NLP can do in real-world settings like healthcare, education, and social assistants. For someone like me who loves robots and language, it shows how computational linguistics helps build smarter, more human-friendly AI systems.

If you want to dive deeper, check out the full report from Johns Hopkins:
Talking robots learn to manage human interruptions

— Andrew

How Computational Linguistics Is Powering the Future of Robotics

As someone who’s been involved in competitive robotics through VEX for several years and recently started diving into computational linguistics, I’ve been wondering: how do these two fields connect?

At first, it didn’t seem obvious. VEX Robotics competitions (like the one my team Ex Machina participated in at Worlds 2025) are mostly about designing, building, and coding autonomous and driver-controlled robots to complete physical tasks. There’s no direct language processing involved… at least not yet. But the more I’ve learned, the more I’ve realized that computational linguistics plays a huge role in making real-world robots smarter, more useful, and more human-friendly.

Here’s what I’ve learned about how these two fields intersect and where robotics is heading.


1. Human-Robot Communication

The most obvious role of computational linguistics in robotics is helping robots understand and respond to human language. This is powered by natural language processing (NLP), a core area of computational linguistics. Think about assistants like Alexa or social robots like Pepper. They rely on language models and parsing techniques to interpret what we say and give meaningful responses.

This goes beyond voice control. It’s about making robots that can hold conversations, answer questions, or even ask for clarification when something is unclear. For robots to work effectively with people, they need language skills, not just motors and sensors.


2. Task Execution and Instruction Following

Another fascinating area is how robots can convert human instructions into actual actions. For example, if someone says, “Pick up the red cup from the table,” a robot must break that down: What object? What location? What action?

This is where semantic parsing comes in—turning language into structured data the robot can use to plan its moves. In VEX, we manually code our autonomous routines, but imagine if a future version of our robot could listen to instructions in plain English and adapt its behavior in real time.


3. Understanding Context and Holding a Conversation

Human communication is complex. We often leave things unsaid, refer to past ideas, or use vague phrases like “that one over there.” Research in discourse modeling and context tracking helps robots manage this complexity.

This is especially useful in collaborative environments. Think hospital robots assisting nurses, or factory robots working alongside people. They need to understand not just commands but also user intent, tone, and changing context.


4. Multimodal Understanding

Robots don’t just rely on language. They also use vision, sensors, and spatial awareness. A good example is interpreting a command like, “Hand me the tool next to the blue box.” The robot has to match those words with what it sees.

This is called multimodal integration, where the robot combines language and visual information. In my own robotics experience, we’ve used vision sensors to detect field elements, but future robots will need to combine that visual input with spoken instructions to act intelligently in dynamic spaces.
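A stripped-down version of that grounding step might look like the sketch below. The detections (labels plus x positions) are made-up stand-ins for a vision sensor's output, and the function is my own toy example of resolving “next to” by picking the nearest candidate.

```python
# Toy grounding step: match "the tool next to the blue box" against
# hypothetical vision detections (label + normalized x position).
detections = [
    {"label": "tool", "x": 0.2},
    {"label": "blue box", "x": 0.8},
    {"label": "tool", "x": 0.75},
]

def find_object_next_to(target: str, anchor: str, scene: list[dict]) -> dict:
    """Return the target detection closest to the anchor detection."""
    anchor_obj = next(d for d in scene if d["label"] == anchor)
    candidates = [d for d in scene if d["label"] == target]
    return min(candidates, key=lambda d: abs(d["x"] - anchor_obj["x"]))

print(find_object_next_to("tool", "blue box", detections))
# {'label': 'tool', 'x': 0.75}
```

Even this tiny example shows why multimodal integration is hard: the language side supplies a relation (“next to”), and the vision side has to supply the geometry that makes it meaningful.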


5. Emotional and Social Intelligence

This part really surprised me. Sentiment analysis and affective computing are helping robots detect emotions in voice or text, which makes them more socially aware.

This could be important for assistive robots that help the elderly, teach kids, or support people with disabilities. It’s not just about understanding words. It’s about understanding people.


6. Learning from Language

Computational linguistics also helps robots learn and adapt over time. Instead of hardcoding every behavior, researchers are working on ways for robots to learn from manuals, online resources, or natural language feedback.

This is especially exciting as large language models continue to evolve. Imagine a robot reading its own instruction manual or watching a video tutorial and figuring out how to do a new task.


Looking Ahead

While none of this technology is part of the current VEX Robotics competition (at least not yet), understanding how computational linguistics connects to robotics gives me a whole new appreciation for where robotics is going. It also makes me excited about studying this intersection more deeply in college.

Whether it’s through smarter voice assistants, more helpful home robots, or AI systems that respond naturally, computational linguistics is quietly shaping the next generation of robotics.

— Andrew

Ex Machina Gears Up for VEX Worlds 2026 in St. Louis

After an incredible season last year where our team, Ex Machina, competed at the VEX Robotics World Championship 2025, I’m excited to share that we’re back for another season! I’ll continue competing this season as a team member of Ex Machina, building on everything we learned from competing together at the global championship.


A New Season, A New Challenge

This year’s game for the VEX V5 Robotics Competition has been announced, and it looks both challenging and fun. Here is the official game reveal video so you can see what teams will be working on this season:

Watch the VEX V5 Robotics Competition 2026 Game Reveal

From the initial reveal, I can already tell that strategy, design innovation, and precise teamwork will be key to succeeding this year.


Balancing Robotics and College Applications

This season is going to be especially busy for me and my teammates. As rising seniors, we’re all deep into the college application process. Between essays, interviews, and preparing for upcoming deadlines, our schedules are definitely packed. But despite the workload, we’ve all decided to continue competing. Robotics has been such an important part of our high school journey, and we’re passionate about pushing ourselves further as a team in our final season together.


VEX Worlds 2026 Heads to St. Louis

There’s another big change this year: for 2026, the VEX Robotics World Championship is moving to St. Louis, Missouri! For the past few years, the event was held in Dallas, Texas, so this will be a new experience for everyone.

The championship will be held in April 2026 at the America’s Center Convention Complex in downtown St. Louis, with specific dates to be announced later. You can read more details about the upcoming event on the REC Foundation’s official page.

Here is a video introducing VEX Worlds 2026 in St. Louis to get you excited for what’s ahead:

VEX Robotics World Championship Heads to St. Louis in 2026


Looking Ahead

It feels both exciting and bittersweet to enter my final year of high school robotics. I know the journey ahead will be intense with balancing robot design, programming, and competition prep alongside college applications, but I’m ready for the challenge.

I’ll keep sharing updates about our season as we start building and competing, so stay tuned to see how Ex Machina continues to grow in 2026.

— Andrew
