Reading PET/CT reports and lab values became a way to navigate uncertainty—and showed me how patient-facing information systems can change the balance of power in care.
In many clinic visits with my mother, I felt like we were always one step behind: waiting for a doctor to interpret a scan, waiting for someone to call, waiting for news that might or might not arrive. MyChart changed that—not by solving everything, but by changing who could see information and when.
For the first time, I could see parts of what clinicians saw. And that changed how I thought about agency, data, and design in healthcare.
When my mother finished a PET/CT scan or a major lab panel, she often could not wait calmly for the next appointment. She worried. I worried. The days between tests and visits stretched out, full of questions we did not yet know how to ask.
Through MyChart, I could:
• see imaging and lab results as soon as they were released
• read the radiologist’s impressions
• track changes over time
• prepare questions in advance
I did not understand everything at first. I looked up unfamiliar terms, re-read reports, and slowly built my own mental model of what “stable,” “progression,” or “response” meant in her case.
This did not replace the doctor. But it changed what it meant to walk into the room.
Before, clinic visits felt like events where information moved in only one direction: from clinician to us. With MyChart, visits became conversations I could prepare for.
I would arrive with:
• specific time points highlighted (“This value changed here.”)
• patterns I noticed (“Why did this marker spike and then fall?”)
• clarifying questions (“Does this wording mean progression or just artifact?”)
My mother relied on me to interpret the information, and I relied on MyChart to see it early enough to make sense of it.
Caregiving became a kind of applied data analysis—reading, comparing, asking, and translating clinical language into something my mother could live with.
What MyChart gave us was not certainty. It was clarity.
Clarity looked like:
• being able to see that a scan was stable before hearing it aloud
• realizing that a lab abnormality had already begun to resolve
• confirming that a frightening symptom did not match radiology findings
• seeing follow-up appointments and referrals in the system
Information did not remove fear. But it contained it.
As I read more reports, I learned to distinguish between:
• alarming language vs. routine phrasing
• true progression vs. cautious wording
• expected side effects vs. red flags
This informal, self-taught literacy changed how I supported my mother emotionally. I could say,
“The scan shows no new lesions. Let’s ask them about this one area, but overall, it’s stable.”
That kind of sentence is not just data—it is a different emotional reality.
From a design perspective, MyChart is a patient-facing interface to the EHR. It sits between:
• clinical documentation
• internal messaging
• scheduling systems
• lab and imaging pipelines
But from a lived perspective, it is:
• a way to track patterns
• a place to revisit past conversations
• a partial window into clinician thinking
• an anchor during long treatment cycles
It made me think about:
• which fields are exposed to patients
• how language is rendered
• what information is delayed or withheld
• how interface design affects understanding
My experience with MyChart motivates my interest in:
• patient-facing health information systems
• information clarity and visual design
• how caregivers interpret raw clinical text
• how portals could better support multilingual families
I do not see MyChart as a finished tool. I see it as an early version of something that could:
• highlight trends visually
• provide context-sensitive explanations
• support multilingual overlays
• connect patients’ questions with clinicians’ documentation patterns
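To make the first of these ideas concrete, here is a minimal sketch of what "highlighting trends" in a lab series might look like. Everything here is illustrative: the `LabResult` fields, the 20% change threshold, and the plain-language labels are my assumptions, not MyChart's actual data model or behavior.

```python
# Hypothetical sketch of trend labeling for a patient-visible lab series.
# Field names and thresholds are illustrative assumptions, not MyChart's API.

from dataclasses import dataclass

@dataclass
class LabResult:
    date: str          # ISO date of the draw
    value: float       # measured value
    ref_low: float     # lower bound of the reference range
    ref_high: float    # upper bound of the reference range

def trend_label(series: list[LabResult], pct_change: float = 0.2) -> str:
    """Classify the most recent movement in a lab series.

    Compares the latest value to the previous one and to the
    reference range, and returns a plain-language label a portal
    could surface next to the raw number.
    """
    latest = series[-1]
    in_range = latest.ref_low <= latest.value <= latest.ref_high
    range_note = "in range" if in_range else "out of range"
    if len(series) < 2:
        return range_note
    prev = series[-2]
    # Guard against division by zero for a zero-valued previous result.
    delta = (latest.value - prev.value) / prev.value if prev.value else 0.0
    if abs(delta) >= pct_change:
        direction = "rising" if delta > 0 else "falling"
        return f"{direction}, {range_note}"
    return f"stable, {range_note}"

# Usage: a marker that jumped between two draws but stayed within range.
marker = [
    LabResult("2023-01-05", 12.0, 0.0, 35.0),
    LabResult("2023-02-02", 30.0, 0.0, 35.0),
]
print(trend_label(marker))  # "rising, in range"
```

A label like "stable, in range" is exactly the kind of sentence I found myself constructing by hand from raw reports; the design question is whether the interface can offer it directly, with the raw numbers still one tap away.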
For me, MyChart was not just a website. It was a bridge between data and lived experience. And it showed me how much power, and comfort, lie in being able to see.