AI in Schoolwork: Different Approaches Taken in the U.S. and China

Recently, I read an article from MIT Technology Review titled “Chinese universities want students to use more AI, not less.” As a high school student growing up in Washington state, I couldn’t help thinking about the differences in how the U.S. and China are approaching AI in education.

In China, AI has gone from taboo to toolkit in just a couple of years. University students once had to find mirror versions of ChatGPT through secondhand marketplaces and VPNs just to access the tools. Back then, professors warned students not to use AI for assignments. But now, things have completely changed.

Chinese universities are actively encouraging students to use generative AI tools, as long as they follow best practices. Professors are adding AI-specific lessons to their classes. For example, one law professor teaches students how to prompt effectively and reminds them that AI is only useful when combined with human judgment. Students are using tools like DeepSeek for everything from writing literature reviews to organizing thoughts.

This push for AI education isn’t just happening in individual classrooms. It’s backed by national policy. The Chinese Ministry of Education released guidelines in April 2025 calling for an “AI plus education” approach. The goal is to help students develop critical thinking, digital fluency, and real-world skills across all education levels. Cities like Beijing have even introduced AI instruction in K–12 schools.

In China, AI is also viewed as a key to career success. A report from YiCai found that 80 percent of job listings for recent college grads mention AI as a desired skill. So students see learning how to use AI properly as something that gives them a competitive edge in a tough job market.

That’s pretty different from what I’ve seen here in the U.S.

In July 2024, the Washington Office of Superintendent of Public Instruction (OSPI) released official guidance for AI in schools. The message isn’t about banning AI. It’s about using it responsibly. The guidance encourages human-centered learning, with values like transparency, privacy, equity, and critical thinking. Students are encouraged to use AI tools to support their learning, but not to replace it.

Instead of secretly using AI to write a paper, students in Washington are encouraged to talk openly about how and when they use it. Teachers are reminded that AI should be a support, not a shortcut. The guidance also warns about overusing AI detection tools, especially since those tools can sometimes unfairly target multilingual students.

Adding to this, a recent brain-scan study by MIT Media Lab called “Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task” raises some interesting points. Over four months, participants had their brains scanned while using ChatGPT for writing tasks. The results were surprising:

  • 83% of AI users couldn’t remember what they had just written
  • Brain activity dropped by 47% in AI users and stayed low even after stopping
  • Their writing was technically correct but described by teachers as robotic
  • ChatGPT made users 60% faster, but reduced learning-related brain activity by 32%

The group that performed the best started their work without AI and only added it later. They had stronger memory, better brain engagement, and wrote with more depth. This suggests that how we use AI really matters. If we rely on it too much, we might actually learn less.

MIT’s full research overview can be found here, or you can read the paper on arXiv. (A caveat called out by the research team: as of June 2025, when the first paper from the project was uploaded to arXiv, the preprint service, it had not yet been peer-reviewed, so all conclusions should be treated with caution and as preliminary.)

So what does this all mean?

I think both China’s and our approaches have something valuable to offer. China is focused on future skills and career readiness. The U.S. is focused on ethics, fairness, and critical thinking. Personally, I believe students should be allowed to use AI in schoolwork, but with the right guidance. We should be learning how to prompt better, double-check results, and combine AI tools with our own thinking.
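
To make that concrete, here is a minimal sketch of the “draft first, AI second” workflow that the MIT results seem to favor. I’m using the OpenAI Python library, and the model name, the tutor-style prompt, and the overall setup are just my own illustration of one possible approach, not anything prescribed by the article or the study.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

# Step 1: write the draft yourself, before touching any AI tool.
my_draft = """AI tools can help students learn, but only if we keep doing
the thinking ourselves. In this essay I argue that..."""

# Step 2: ask the model to critique the draft, not to rewrite it.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name; any chat model works here
    messages=[
        {
            "role": "system",
            "content": (
                "You are a writing tutor. Point out weak arguments, missing "
                "evidence, and unclear sentences. Do not rewrite the essay "
                "or add new content of your own."
            ),
        },
        {"role": "user", "content": my_draft},
    ],
)

# Step 3: read the feedback, double-check any factual claims it makes,
# and revise the draft in your own words.
print(response.choices[0].message.content)
```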

AI is already part of our world. Instead of hiding from it, we should be learning how to use it the right way.

You can read the full MIT Technology Review article here.
Washington’s official AI guidance for schools (published July 2024) is here (PDF).

— Andrew


How I Published My STEM Research in High School (and Where You Can Too)

Publishing as a high school student can be an exciting step toward academic growth and recognition. But if you’re anything like me when I started out, you’re probably wondering: Where do I even submit my work? And maybe more importantly, how do I avoid falling into the trap of predatory or low-quality journals?

In this post, I’ll walk through a curated list of reputable STEM journals that accept high school submissions—along with some honest thoughts from my own publishing journey. Whether you’re writing your first paper or looking for your next outlet, I hope this helps.


📚 10 Reputable Journals for High School Research (Especially STEM)

These are ranked loosely by selectiveness, peer-review rigor, and overall reputation. I’ve included each journal’s website, review cycle, and key details so you can compare.

  1. Columbia Junior Science Journal (CJSJ)
    Selection Rate: ~10–15% (very selective)
    Subjects: Natural sciences, engineering, social sciences
    Peer Review: Professional (Columbia faculty/editors)
    Cycle: Annual (6–9 months)
    🔗 cjsj.org
  2. Journal of Emerging Investigators (JEI)
    Selection Rate: ~70–75%
    Subjects: Biological/physical sciences (hypothesis-driven only)
    Peer Review: Graduate students and researchers
    Cycle: Rolling (7–8 months)
    🔗 emerginginvestigators.org
  3. STEM Fellowship Journal (SFJ)
    Selection Rate: ~15–20%
    Subjects: All STEM fields
    Peer Review: Canadian Science Publishing reviewers
    Cycle: Biannual (4–5 months)
    🔗 journal.stemfellowship.org
  4. International Journal of High School Research (IJHSR)
    Selection Rate: ~20–30%
    Subjects: STEM, behavioral, and social sciences
    Peer Review: Author-secured (3 academic reviewers)
    Cycle: Rolling (3–6 months)
    🔗 ijhsr.terrajournals.org
  5. The Young Researcher
    Selection Rate: ~20–25%
    Subjects: STEM, social sciences, humanities
    Peer Review: Faculty and researchers
    Cycle: Biannual (4–6 months)
    🔗 theyoungresearcher.com
  6. Journal of Student Research (JSR)
    Selection Rate: ~70–80%
    Subjects: All disciplines
    Peer Review: Faculty reviewers
    Cycle: Quarterly (6–7 months)
    🔗 jsr.org
  7. National High School Journal of Science (NHSJS)
    Selection Rate: ~20%
    Subjects: STEM and social sciences
    Peer Review: Student-led with academic oversight
    Cycle: Rolling (3–5 months)
    🔗 nhsjs.com
  8. Journal of High School Science (JHSS)
    Selection Rate: ~18%
    Subjects: STEM, arts (STEAM focus, quantitative research)
    Peer Review: Academic reviewers
    Cycle: Quarterly (4–6 months)
    🔗 jhss.scholasticahq.com
  9. Curieux Academic Journal
    Selection Rate: ~30–40%
    Subjects: STEM, humanities, social sciences
    Peer Review: Student-led with professional oversight
    Cycle: Monthly (fast-track: 2–5 weeks; standard: 1–3 months)
    🔗 curieuxacademicjournal.com
  10. Young Scientists Journal
    Selection Rate: ~40–50%
    Subjects: STEM (research, reviews, blogs)
    Peer Review: Student-led with expert input
    Cycle: Biannual (3–6 months)
    🔗 ysjournal.com

🧠 My Experience with JHSS, JSR, and NHSJS

1. Journal of High School Science (JHSS)
This was the first journal I submitted to, on November 13, 2024. The submission process was straightforward, and the portal clearly tracked every stage of the review. I received feedback on December 29, but unfortunately, the reviewer seemed unfamiliar with the field of large language models. The decision was based on two Likert-scale questions:

  • “The paper makes a significant contribution to scholarship.”
  • “The literature review was thorough given the objectives and content.”

The first was marked low, and the second was marked neutral. I shared the feedback with LLM researchers from top-tier universities, and they agreed the review wasn’t well-grounded. So heads up: JHSS does have a formal structure, but you may run into an occasional reviewer mismatch.

2. Journal of Student Research (JSR)
Originally, I was going to submit my second paper here. But I ended up choosing NHSJS because JSR’s review timeline was too long for my goals (6–7 months vs. NHSJS’s 3–5 months). That said, JSR has one of the clearest submission guides I’ve come across:
👉 JSR Submission Info
If you’re not in a rush and want a polished process, it’s a solid option.

3. National High School Journal of Science (NHSJS)
This is where I published my first solo-authored research paper (see my earlier post). What stood out to me:

  • Quick response times
  • Detailed and constructive reviewer feedback

My reviewers gave me 19 major and 6 minor suggestions, each with specific guidance. It was incredibly helpful as a student navigating scientific writing for the first time.

That said, the journal’s submission format was a bit confusing (e.g., its citation style is non-standard), and the guidelines weren’t always followed by other authors. I had to clarify formatting details directly with the editor. So: highly recommend NHSJS—just make sure you confirm your formatting expectations early.


Final Thoughts

If you’re serious about publishing your research, take time to explore your options. The review process can be slow and sometimes frustrating, but it’s one of the best ways to grow as a thinker and writer.

Let me know if you have any questions. I’d be happy to share more from my experience.

— Andrew

Happy New Year 2025! Reflecting on a Year of Growth and Looking Ahead

As we welcome 2025, I want to take a moment to reflect on the past year and share some exciting plans for the future.

Highlights from 2024

  • Academic Pursuits: I delved deeper into Natural Language Processing (NLP), discovering Jonathan Dunn’s Natural Language Processing for Corpus Linguistics, which seamlessly integrates computational methods with traditional linguistic analysis.
  • AI and Creativity: Exploring the intersection of AI and human creativity, I read Garry Kasparov’s Deep Thinking, which recounts his experiences with AI in chess and offers insights into the evolving relationship between humans and technology.
  • Competitions and Courses: I actively participated in Kaggle competitions, enhancing my machine learning and data processing skills, which are crucial in the neural network and AI aspects of Computational Linguistics.
  • Community Engagement: I had the opportunity to compete in the 2024 VEX Robotics World Championship and reintroduced our school’s chess club to the competitive scene, marking our return since pre-COVID times.

Looking Forward to 2025

  • Expanding Knowledge: I plan to continue exploring advanced topics in NLP and AI, sharing insights and resources that I find valuable.
  • Engaging Content: Expect more in-depth discussions, tutorials, and reviews on the latest developments in computational linguistics and related fields.
  • Community Building: I aim to foster a community where enthusiasts can share knowledge, ask questions, and collaborate on projects.

Thank you for being a part of this journey. Your support and engagement inspire me to keep exploring and sharing. Here’s to a year filled with learning, growth, and innovation!
