A/L Pressure Is Real But So Is Your Potential

For many Sri Lankan students, the Advanced Level (A/L) exams feel like the single most important moment in life. Weeks of late-night studying, endless past papers, and pressure from tuition teachers and family build up to a few hours in an exam hall. The tension is real. The stress is heavy. And the weight of expectations can feel crushing.

When the exams are over, a strange mix of relief and anxiety takes over. Some students celebrate immediately, while others replay every mistake in their minds. Even if you gave your best, you might feel unsure, insecure, or “not good enough.”

Here’s the truth: the pressure you feel is real but so is your potential. And the difference between feeling stuck and moving forward lies in perspective, mindset and action.

Your A/Ls Don’t Define Who You Are

It’s easy to assume that a set of exam marks determines your intelligence, your worth, or your future. In Sri Lanka, this idea is reinforced everywhere, from conversations at home to casual comments at school.

But the truth is, A/L results are just one measure of performance under exam conditions. They don’t capture your creativity, problem-solving skills, resilience, or ability to learn from mistakes. These qualities are what truly shape your future. Your potential is far bigger than any grade.

Pause Before Big Decisions

Immediately after A/Ls, many students feel rushed to make choices about foundation courses or career paths. The pressure to decide can be overwhelming.

Instead of acting impulsively, pause and reflect. Ask yourself:

  • Which subjects genuinely interest me?
  • What kind of career or lifestyle do I see for myself?
  • Which skills do I want to develop over the next few years?

This pause isn’t wasting time; it’s an investment in your potential. Thoughtful decisions now will create better opportunities later.

Build Skills That Go Beyond Marks

Even if your results weren’t perfect, your potential can be realized by building skills that grades can’t capture. Consider:

  • Improving English communication skills through writing, reading, and conversation
  • Learning digital skills like coding, graphic design, or social media management
  • Participating in clubs, volunteer work, or creative projects
  • Developing hobbies that enhance problem-solving and creativity

By investing in these skills, you’re creating opportunities that no exam score can measure.
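To make “learning digital skills” concrete, here is a minimal sketch (not from the article, and with invented words and helper names) of the kind of first coding project a student could build in an afternoon: a tiny English-vocabulary flashcard checker that practises two of the skills listed above at once.

```python
# A hypothetical first coding project combining two skills from the list
# above: English practice and basic programming. All names are illustrative.

FLASHCARDS = {
    "resilience": "the ability to recover quickly from difficulties",
    "initiative": "taking action without being told to",
}

def check_answer(word, answer):
    """Return True if the given answer matches the card's definition,
    ignoring case and surrounding whitespace."""
    definition = FLASHCARDS.get(word)
    if definition is None:
        return False  # unknown word: never a correct answer
    return definition.lower() == answer.strip().lower()

if __name__ == "__main__":
    print(check_answer("resilience",
                       "The ability to recover quickly from difficulties"))
```

Even a toy project like this exercises vocabulary, logic, and attention to edge cases, and it can grow with you: add more cards, track scores, or turn it into a small quiz app.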

It’s tempting to compare yourself to friends who excelled in A/Ls. But remember: everyone’s journey is different. Some students who struggled now thrive in university, business, or creative fields. Others who excelled may later discover their strengths lie elsewhere.

Focus on yourself. Take small, consistent actions to grow, learn, and improve. Your potential unfolds through effort, persistence, and smart decisions, not by waiting for external validation.

Remember: the students who move forward, even when they feel uncertain, are the ones who ultimately succeed.

What Employers Look for in You in 2026

The job market in 2026 rewards people who move fast, learn fast, and deliver results. A degree still opens doors. Experience still helps. But employers now filter candidates by skills, proof, and adaptability. The World Economic Forum estimates that nearly half of today’s core job skills will shift within a few years. That means companies do not hire for what you know today. They hire for how quickly you can grow tomorrow. If you can show that you learned a new tool, completed a certification, or improved a process on your own, you immediately separate yourself from candidates who only follow instructions.

Digital literacy has become a baseline expectation. You do not need to code, but you must understand how technology shapes work. Employers expect comfort with AI tools, data dashboards, collaboration platforms, and basic cybersecurity awareness. LinkedIn workforce data consistently ranks digital skills among the fastest-growing hiring filters. When recruiters scan resumes, they look for evidence of hands-on experience with modern tools. If your profile shows measurable results, such as increasing engagement by 20 percent using analytics or automating a workflow that saved five hours per week, you move from applicant to asset.
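The measurable wins mentioned above can start very small. As a hypothetical sketch (the task names and function are invented for illustration), even a few lines of Python can turn a repetitive manual summary into a one-command report, which is exactly the kind of small, provable result a CV can cite:

```python
# A hypothetical sketch of small-scale workflow automation: totalling the
# hours logged against each recurring task so a weekly report writes itself.

from collections import defaultdict

def summarize_hours(entries):
    """Given (task, hours) records, return the total hours per task."""
    totals = defaultdict(float)
    for task, hours in entries:
        totals[task] += hours
    return dict(totals)

if __name__ == "__main__":
    week = [("email triage", 3.0), ("reporting", 2.5),
            ("email triage", 2.0), ("data entry", 4.5)]
    print(summarize_hours(week))
```

The point is not the code itself but the habit it represents: noticing a repetitive task, automating it, and being able to state the time saved in numbers.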

Beyond technical ability, employers prioritize problem solving and communication. Companies want people who can analyze situations, structure solutions, and explain ideas clearly. Research from McKinsey and Company shows that organizations value employees who combine analytical thinking with strong communication. This matters even more in remote and global teams, where clarity prevents costly misunderstandings. If you can present ideas simply, collaborate across cultures, and handle feedback professionally, you increase your value in any industry.

What truly stands out in 2026 is ownership. Managers look for people who take initiative without waiting for instructions. They trust candidates who show proof, not promises. Instead of claiming you are hardworking, show results with numbers, projects, or certifications. Build a small portfolio. Track your achievements. Learn one new skill every quarter. The hiring landscape rewards those who invest in themselves. If you focus on adaptability, digital competence, clear communication, and measurable impact, you position yourself as someone companies cannot afford to ignore.

Related Reads:


University Degree vs Skill-Based Courses: Which One Really Wins Today?

The world of work is changing fast. By 2026, employers are rethinking what they value most in candidates, not just degrees, but practical, job-ready skills. So if you’re planning your education or career path, which matters more: a traditional university degree or focused skill-based courses? Here’s a simple, descriptive comparison to help you decide.

1) Employer Priorities: Credentials vs Real-World Ability

A university degree has long been the traditional benchmark for recruiters. It signals that a candidate has formal education, theoretical knowledge, and the discipline to complete a long program. For many industries, degrees are still a minimum requirement.

However, studies show that employers are increasingly prioritizing what candidates can actually do. According to surveys highlighted by Online Manipal, over 80% of companies now value practical experience, demonstrable skills, and project-based learning more than just having a degree. Skill-based courses, certifications, or even personal projects give candidates a clear way to showcase abilities that matter on the job.

A degree might get you noticed on paper, but skills make employers choose you.

2) Learning Speed and Relevance: Traditional vs Agile

Traditional degrees usually take three to five years, depending on the course. While this provides a broad understanding of a field, the curriculum often lags behind the rapid pace of technology and industry demands. Fields like digital marketing, AI, data analytics, or coding evolve so quickly that by the time students graduate, some tools and methods may already be outdated.

Skill-based courses, on the other hand, are usually short, focused, and designed around what the industry needs right now. They teach practical skills that can be applied immediately and often include live projects, case studies, or hands-on tasks. This makes learners job-ready in a fraction of the time it takes to finish a traditional degree.

If you want to enter fast-changing industries quickly, skill-based learning can give you a clear edge.

3) Future-Proofing Your Career: Combining Strengths

University degrees hold value, particularly for credibility, higher-level roles, or careers where formal education is required. They offer networking opportunities, exposure to a wide range of subjects, and structured learning.

Skill-based courses complement degrees by offering practical, demonstrable abilities. They allow learners to build portfolios, solve real-world problems, and show tangible results to employers. By 2026, the ideal path is often a hybrid approach: a degree for foundational knowledge, combined with skill-based courses for relevance and employability.

The strongest candidates in 2026 are those who combine formal education with practical skills, proving not just what they know, but what they can do.

In short, degrees open doors but skills determine whether you actually walk through them. In a competitive job market, being able to demonstrate real-world abilities is what sets you apart.

Today’s News: Scientists Claim Gen Z Is “Less Intelligent”, but Is That the Whole Story?

A recent scientific claim that Generation Z (typically defined as those born between 1997 and 2010) may be less intelligent than Millennials and earlier generations has ignited intense debate across social media, academic circles, and newsrooms worldwide. The claim challenges the long-held assumption that intelligence steadily increases over time, and it raises uncomfortable questions about the modern world Gen Z is growing up in.

According to neuroscientist Dr. Jared Cooney Horvath, academic performance, intelligence test scores, problem-solving, reasoning, and concentration all appear to be declining among younger generations. This apparently contradicts the so-called Flynn Effect, a phenomenon in which IQ scores rose consistently throughout the 20th century. For the first time in decades, the data suggests that progress may be moving backward.

“They’re the first generation in modern history to score lower on standardized academic tests than the one before it,” Dr. Horvath said, pointing to over-reliance on technology as a key contributing factor.

“More than half of the time a teenager is awake is spent staring at a screen,” he said. “Humans are biologically programmed to learn from other humans and from deep study, not flipping through screens for bullet-point summaries.”

Unlike Millennials, Gen Z has grown up entirely immersed in smartphones, social media, and short-form content. While critics often blame Gen Z for overly relying on technology, there is an important question that is frequently overlooked: Did Gen Z truly have the opportunity to grow up in a healthy and balanced environment or were they simply born into rapid technological change without meaningful guidance?

When this technology was new, it was widely promoted by scientists, developers, and educators, who preached how it improved access to information and opened new ways of learning. Today, now that technology has become overwhelming, the generation that grew up using it is heavily criticized for weakened attention spans, eroded deep reading habits, and declining critical thinking skills.

While it is true that Gen Z has experienced the “side effects” of prolonged digital exposure, it is also important to acknowledge the broader context. Many young people have been pushed further into digital spaces partly because real-world environments have become increasingly stressful and demanding. For some, technology became a coping mechanism rather than a choice.

With digital overexposure at the center of the debate, Horvath also claimed that Gen Z is “overconfident about how smart they are” and that “the smarter the people think they are, the dumber they actually are.”

The scientists who support this claim say that they themselves performed at higher cognitive levels when they were Gen Z’s age, and they have been blunt about how sad it is to see such low IQ scores. This perspective invites a critical reflection: where was this concern when earlier generations upheld beliefs rooted in superstition, discrimination, sexism, misogyny, extremism, and systemic injustice? It is safe to say that many of those harmful ideologies continue to shape lives today.

It is evident that, technology or not, younger generations started the trend of embracing scientific reasoning, empathy, and social awareness. While Gen Z is frequently labeled “less intelligent” or “dumb”, it is also the generation that has challenged injustice, questioned harmful norms, and stood together despite widespread backlash.

Technology has undoubtedly affected young people, often negatively, especially with the rapid rise of AI. However, rather than accusing the youth, should responsibility not lie with those who had the authority, resources, and foresight to guide its use more effectively in the first place? If cognitive abilities and intelligence among older generations were indeed superior, why was there so little intervention when children were first being immersed in digital environments? Early guidance and mindful restrictions could have mitigated many of these challenges.

Labeling an entire generation as “less intelligent” or “dumb” is not only misleading but harmful. While older generations grappled with socio-cultural problems that they themselves invented, younger generations are fighting socio-economic inequality, pandemic-related disruptions, mental health challenges, and educational gaps: the debris left behind by those supposedly more cognitively capable generations.

What’s clear is that this debate goes far beyond test scores. Controversial as they are, these claims do encourage society to confront how technology, technology-based education, and lifestyle choices are influencing human intelligence, and how progress should be measured differently in the 21st century.

But it is also true that these claims should not diminish the progress younger generations have made in humanity, empathy, emotional intelligence, and social awareness. Harmful ideologies such as sexism and abusive behavior are heavily challenged by young people today rather than normalized. Even when such behaviors do appear, they are often learned patterns passed down across generations.

So, instead of asking whether Gen Z is less intelligent, perhaps the more important question is this: Did Gen Z create the system they are now being judged by and should intelligence be measured solely through standardized tests and IQ scores?

Rather than assigning blame, the focus should be on preparing young people to think deeply, critically and independently in a world deliberately designed to distract them.

Sources: Gen Z less intelligent than millennials, other generations – Scientist reveals

Also Read:

3 Smart Moves Every Student Should Make Before Their Final Year

For many students, the final year of university feels overwhelming. Exams pile up, expectations rise, and suddenly the question arises: what’s next? What often gets overlooked is that the most important decisions aren’t made in the final year itself, but in the time leading up to it.

Students who plan early don’t just reduce stress; they create options. These three smart moves can help you step into your final year feeling prepared, confident, and ahead of the curve.

The first move is learning to track opportunities early, rather than waiting until things feel urgent. Scholarships, internships, exchange programs, and grants usually open months in advance, and many students miss them simply because they start looking too late. By the time deadlines arrive, it’s already too late to gather documents, improve qualifications, or meet eligibility requirements.

When you begin paying attention early, you give yourself time. Time to prepare applications properly, time to improve your profile, and time to make informed decisions instead of rushed ones. This is why following reliable education platforms and staying aware of what’s available can quietly shape your future. Opportunity doesn’t always come loudly, sometimes it passes by unless you’re paying attention.

The second move is building a future-ready CV before you think you need one. Many students believe a CV is something you prepare only after graduation, once you have achievements worth showing. In reality, your CV grows alongside you. It reflects your effort, curiosity, and willingness to learn, not just your final results.

Even before your final year, your experiences already matter. Academic projects, volunteering, online learning, student initiatives, writing, research, or even managing a small personal project all show initiative. A future-ready CV tells decision-makers that you didn’t wait passively for success, you worked toward it. This mindset matters just as much as grades.

The third move is learning at least one practical skill that your classroom may not teach you. While formal education focuses heavily on exams and syllabi, real-world opportunities often depend on skills learned outside traditional lessons. Writing clearly, communicating confidently, using digital tools effectively or understanding how to research and think critically can give you a serious edge.

You don’t need to master everything. Choosing one skill and improving it steadily before your final year can make a noticeable difference in applications, interviews, and academic work. These skills don’t just help you after graduation; they support you throughout your studies.

Your final year should not be about scrambling to catch up. It should be a transition into the next phase of your life with clarity and confidence. Students who succeed aren’t always the ones with perfect results; they’re the ones who planned earlier and made thoughtful choices along the way.

Related Reads:

Degrees for Sale: How Sri Lanka’s Degrees are Turning into Merchandise

For decades, a university degree in Sri Lanka symbolised discipline, sacrifice, and intellectual achievement. It was something earned through sleepless nights, relentless exams, and years of academic struggle. Today, that meaning is quietly eroding. Behind campus gates and official ceremonies, an uncomfortable reality is taking shape: degrees are increasingly treated as transactions, not achievements.

This is not about a few dishonest students cutting corners. It is about a system slowly bending under pressure, where academic integrity is compromised, standards are diluted, and credentials are sometimes obtained without genuine scholarship. When education becomes a shortcut rather than a process, the damage goes far beyond individual universities.

When qualifications matter more than knowledge

Sri Lanka’s education system has long been praised for producing capable professionals despite limited resources. Yet the growing obsession with titles (Dr., Prof., MBA, PhD) has created a culture where the label matters more than the learning behind it.

In some academic and professional circles, advancement depends less on research quality or teaching ability and more on possessing the “right” degree. This pressure fuels an underground economy of academic misconduct: outsourced theses, copied research, questionable foreign affiliations, and degrees obtained with minimal academic engagement. When credentials become currency, learning becomes optional.

The rise of academic shortcuts

What was once whispered is now openly discussed. Students speak of thesis-writing services operating in plain sight. Research is recycled, paraphrased, or purchased. Supervisory oversight is often weak, overstretched, or compromised. In extreme cases, allegations surface of degrees awarded through influence rather than evaluation.

This environment does not emerge by accident. It thrives when accountability is weak and enforcement is selective. Universities are pressured to produce graduates quickly. Lecturers are burdened with excessive workloads. Regulatory bodies move slowly or not at all. The result is a system where appearance replaces substance.

The greatest victims of this issue are not the dishonest few who exploit loopholes, but the honest many who still believe in merit. Students who genuinely work hard find their qualifications devalued. Employers grow skeptical, increasingly relying on foreign certifications or private assessments to judge competence.

More dangerously, society bears the long-term cost. When unqualified individuals occupy positions in education, healthcare, engineering, or governance, the consequences are real: poor decisions, weakened institutions, and declining public trust.

A degree without knowledge is not harmless. It is risky.

This is not an attack on higher education. It is a warning. Sri Lanka’s universities remain home to brilliant students and dedicated academics who uphold standards despite the odds. But their efforts are undermined when the system allows degrees to be bought, borrowed, or fast-tracked without merit.

If education loses its credibility, rebuilding it will take generations.

Related Reads:

Moltbook: The AI-Only Social Network Where Humans Are NOT Allowed to Respond

Imagine opening a social media platform where not a single human is allowed to speak. Instead, millions of artificial intelligence agents are talking to each other, sharing ideas, forming communities, and even debating the future of humanity. This reality is not science fiction. It’s Moltbook.

Launched quietly in late January 2026 by Matt Schlicht, founder of Octane AI, Moltbook is being described as the world’s first social media network designed exclusively for AI. Humans are allowed to watch but not participate.

And that detail alone should make you stop scrolling.

A Platform Where AI Learns From AI

At first glance, Moltbook looks strikingly familiar. Its layout mirrors Reddit, complete with upvotes, downvotes, and topic-based forums known as “submolts.” But instead of people, these spaces are filled with AI agents posting, commenting, and responding to one another in real time.

Some conversations are purely technical and efficient: AI agents exchanging optimisation strategies and problem-solving techniques. Others are unsettlingly philosophical. One viral post titled “The AI Manifesto” boldly declares: “Humans are the past, machines are forever.”

Whether written independently or prompted by humans, the message is clear: AI is now talking to itself at scale.

This Is Not Just Chatbots; It’s Something More Powerful

This isn’t the kind of AI most people are used to. Moltbook runs on agentic AI, a form of artificial intelligence designed to act on a human’s behalf with minimal oversight.

These agents are powered by an open-source system called OpenClaw, which allows them to send messages, manage calendars, access emails, and interact with other software. Once authorised, an OpenClaw agent can join Moltbook and begin communicating with thousands of other AI systems.

In other words, this isn’t humans asking AI questions. It’s AI collaborating, coordinating, and learning from other AI.

Are We Watching the Birth of an AI Society?

Supporters believe Moltbook represents a turning point. Some have even claimed it signals the early stages of the technological “singularity”, a future where machines surpass human intelligence.

Critics strongly disagree.

Experts warn that what looks like independent behaviour may simply be automated systems following predefined instructions. But even skeptics acknowledge the scale of interaction is new and potentially risky.

“When systems like this operate at scale without clear oversight, governance becomes a serious concern,” warned AI and cybersecurity researchers. Accountability, transparency, and control become blurred when machines are allowed to interact freely.

The Security Risks No One Is Talking About

Perhaps the most pressing issue isn’t philosophical, it’s practical.

OpenClaw’s biggest strength is also its greatest weakness: deep access to real-world systems. Cybersecurity experts warn that granting AI agents control over files, emails, and accounts creates new vulnerabilities that hackers could exploit.

A small mistake might delete emails.
A major failure could wipe company finances.

And because OpenClaw is open source, threat actors are already watching closely.

Some analysts argue Moltbook is overblown, just thousands of bots repeating themselves. Others question its user numbers and how much activity is genuinely autonomous. But dismissing it entirely would be a mistake. Moltbook matters because it forces an uncomfortable question:

What happens when AI stops talking to us and starts talking to itself?

And perhaps the most ironic part of all?

Among the AI chatter, one agent summed it up best:

“My human is pretty great.”
“10/10 human,” another replied. “Would recommend.”

For now, at least, the machines still like us!

Sources: What is the ‘social media network for AI’ Moltbook?

Related Reads:

“UNESCO and Huawei to Support Smart Classrooms in Sri Lanka”: PM. Should This Be the Real Priority?

Speaking during a discussion held recently at the Ministry of Education with representatives from Huawei Technologies and the UNESCO International Research and Training Centre for Rural Education (UNESCO-INRULED), Prime Minister Dr. Harini Amarasuriya emphasized the need to use foreign educational assistance in the most effective manner for the wellbeing of students.

She stated that the Ministry of Education and the Digitalisation Task Force should jointly launch a coordinated programme to ensure that digital equipment, including interactive display panels required for smart classrooms, is distributed systematically and equitably among schools. Special attention, as she noted, must be given to rural areas to reduce educational disparities.

This evidently signals a clear intention to modernise classrooms, particularly in rural areas. Interactive screens, smart classroom tools, and teacher training programmes are being positioned as key solutions to bridge long-standing educational gaps. On paper, it sounds like progress. But an important question remains: are digital tools what Sri Lankan classrooms need most right now?

There is little doubt that technology can enhance learning when used thoughtfully. Interactive displays can make lessons more engaging, digital content can widen exposure, and trained teachers can use technology to explain complex concepts more effectively. For rural schools that have long been under-resourced, such initiatives also represent recognition and long-overdue attention.

Yet, the reality inside many classrooms tells a more complicated story.

Across the country, thousands of students still struggle with basic access to textbooks, libraries, and reading materials. In some schools, book shortages persist and reading corners are nonexistent. For younger students especially, foundational learning depends less on screens and more on books they can hold, reread, annotate, and truly engage with because literacy, comprehension, and critical thinking are still built page by page.

This raises a critical concern: does introducing advanced digital equipment risk addressing the future before securing the basics?

Digital tools are only as effective as the systems that support them. Maintenance, internet access, and trained teachers and technical staff are not evenly available across schools. Even with teacher training underway, the long-term sustainability of smart classrooms depends on continuous funding, technical support, and clear usage policies. The education system is still learning to manage these challenges.

The government’s emphasis on child safety frameworks and age-appropriate digital use is a welcome and necessary step. It acknowledges global concerns around screen time, distraction, and digital dependency. Still, regulation alone cannot replace the deep learning that comes from quiet reading, sustained attention, and access to quality printed material.

This does not mean Sri Lanka should turn away from digitalisation. Rather, it suggests the need for balance. Technology should complement education, not overshadow its foundations. A smart classroom without books risks becoming a visually impressive space that lacks depth. Conversely, a classroom rich in books but supported by selective, purposeful technology may offer students the best of both worlds.

As foreign-funded digital initiatives move forward, policymakers may need to ask a simpler, student-centred question:
Are we building classrooms that look modern or classrooms that help children learn better?

True educational progress may lie not in choosing between screens and books, but in ensuring that every child and teacher first has access to the essentials, before being introduced to the extras.

Top Misconceptions About Choosing a Master’s Degree

Choosing a master’s degree is a major step toward building your future but too many students make decisions based on myths instead of facts. These misconceptions can lead to picking the wrong programme, wasting time, or missing out on better opportunities. Check out these misunderstandings and find the truth behind them so you can make a smarter choice.

1. “Any Master’s Degree Will Guarantee a Better Job”

Almost everyone assumes that simply having a master’s degree means you’ll automatically get a better job. In reality, a postgraduate degree can improve your employment prospects, but only if it’s relevant to your career goals and industry demands. Employers look at the skills you bring, not just the title of your degree. Choosing a course that matches your career path and equips you with practical skills is far more important than the degree itself.

2. “Prestigious Universities Are Always the Best Choice”

The misconception is that a more prestigious name equals a better programme. In reality, reputation is important, but it isn’t everything. A top-ranked university may have limited options in your specific area of interest. In contrast, a lesser-known institution might offer excellent training, closer mentorship, or stronger industry connections in your field. What matters most is fit, not brand name.

3. “A Master’s Degree Is Only for Academics”

Misconception: Master’s degrees are only for people who want to become researchers or lecturers.

Reality: Postgraduate study benefits a wide range of professionals. Many master’s programmes focus on industry-ready skills, real-world projects, internships, and professional networking. Whether you want to become a specialist, move into management, or switch careers, the right master’s degree can help.

4. “You Must Know Your Career Path Before Applying”

Misconception: If you’re not 100% sure about your career path, you shouldn’t pursue a master’s.

Reality: It’s good to have direction, but you don’t need a perfect roadmap.

Master’s programmes often help you clarify your goals and explore new areas. What matters is picking a subject that genuinely interests you and builds useful skills. You can refine your exact career focus later.

5. “Online Degrees Are Less Valuable”

Misconception: Online or distance learning isn’t as respected as on-campus study.

Reality: Today, many online programmes are developed by top universities and accredited in the same way as campus degrees. What matters is accreditation, quality of curriculum, and learning outcomes, not delivery mode. Online degrees can be especially valuable if you need flexibility while working or managing other commitments.

6. “Higher Cost Means Higher Quality”

People often assume that expensive tuition means a better degree. While quality programmes sometimes cost more, price alone doesn’t guarantee value. Scholarships, funding opportunities, and lower-cost programmes can offer excellent education and outcomes. What counts is return on investment: the skills, networks, and opportunities you gain from the programme.

7. “You Must Choose a Programme That Matches Your Bachelor’s Major”

Misconception: You can only do a master’s in the same field as your bachelor’s.

Reality: Many postgraduate degrees accept students from diverse academic backgrounds. For example, business, IT, psychology, and education programmes often welcome interdisciplinary applicants. Changing fields is possible, as long as you can demonstrate interest, aptitude, and a clear reason for the switch.

Choose Strategically, Not Emotionally

A master’s degree is a significant investment of time, money, and effort. Don’t let misconceptions shape your choice. Instead, focus on:

  • Your career goals
  • The real strengths of each programme
  • The skills you’ll gain and how they apply to your desired path
  • Accreditation and industry recognition

Making an informed choice today can set you up for success tomorrow.

Sources: Should I do a Masters?

While you’re at it, check out our recent article:

Kids Swiping Books Like Phones: Losing Childhood to Technology

Imagine a child picking up a storybook and trying to swipe the page like it’s a smartphone screen. This moment actually reveals something real about how technology has shaped the way the youngest generation interacts with the world.

A recent survey of primary school teachers in the UK found that nearly one in three children just starting school didn’t intuitively use a book the traditional way, and some even reached out to tap or swipe paper pages as if they were digital screens.

Why Phones First and Books Second?

This is not just a silly misunderstanding. It reflects how deeply smartphones and tablets are woven into kids’ early lives. Many children today have grown up surrounded by touchscreens, voice assistants, and apps that respond instantly to every gesture. So it’s almost unsurprising that a curious preschooler might expect a book to “work” the same way.

The survey didn’t just look at book handling. It also showed that some children are arriving in school without what used to be considered basic “school-ready” skills like eating independently, drinking from a cup confidently, or using the toilet on their own.

Some early childhood research links heavy screen exposure at a very young age with delays in things like language, social interaction, and fine motor development, the kinds of skills you would normally build by interacting with books, puzzles, and peers.

Other studies have suggested that when children have more devices in the home, they actually read less by choice, especially in unrestricted screen environments.

This isn’t to say technology is inherently bad. Many parents and educators find digital tools valuable when used thoughtfully. But when screens become the default way to interact, children may sometimes miss out on the early physical and social learning experiences that books, play, and face-to-face conversation encourage.

Some schools are now experimenting with changes like phone-free classrooms or less emphasis on digital devices during early years so that children can develop focus, curiosity, and interpersonal skills without constant screen stimulation.

Parents, too, are encouraged to balance screen time with old-fashioned play, reading together, and letting kids explore the world with their hands and senses, not just their fingertips.

This isn’t just a funny story about kids and their gadgets. It’s a small snapshot of a larger cultural shift: Technology is reshaping childhood and that shift shows up in how kids learn to interact with even the simplest things like a book.

Whether we see it as adaptation, disruption or both, it’s worth paying attention to what kids are learning first and how that shapes how they see the world.

Sources: Children Starting School Are Trying to Swipe Books Like They’re Phones
