Episode Transcript
[00:00:01] Speaker A: Hi, I'm Maggie.
[00:00:01] Speaker B: And I'm Nicole. Welcome to the DAC Dyslexia and Coffee Podcast. We're so happy you could join us. We're both moms and dyslexia interventionists who want to talk about our students and children. In this podcast we'll cover what dyslexia is, how it affects our kids, strategies to help, and topics related to other learning disabilities. Parents are not alone, and we want to give voice to the concerns and struggles we are all having. This is a safe place to learn more about how to help our children grow and succeed in school and in the world. Grab a cup of coffee and enjoy the conversation.
[00:00:36] Speaker A: Hi, everybody. We're going to start with the concept of the week. The concept of the week is our opportunity as practitioners to kind of peel back the curtain a little bit and talk about topics that directly relate to what we do. It's kind of the insider-baseball section of our podcast.
So today's concept of the week has to do with statistical data, and it is the idea of reliability.
So this is a term that you hear when you're talking about standardized testing, which was our topic for last week.
The reliability of a test is a measure of how consistent its results are. In other words, a test is reliable if it gives the same results after repeated administrations under similar conditions.
So a real-world example is your bathroom scale. If you step on the scale and see a certain number, then step off and step back onto that scale, it should give you that same number a second or a third time.
Right. Because you're not changing anything about it. You're not weighing once in the morning and once at night; you're doing it under the same conditions. So it should be giving you the same number, and if it's not, that is not a reliable scale.
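For listeners who like to see the math behind the scale analogy, test-retest reliability is often quantified as the correlation between two administrations of the same test. Here is a minimal sketch in Python, with invented scores for illustration:

```python
# Test-retest reliability as the Pearson correlation between scores
# from two administrations of the same test under similar conditions.
# The scores below are invented for illustration.
from statistics import correlation  # requires Python 3.10+

first_administration = [98, 102, 87, 110, 95, 101, 92]
second_administration = [97, 103, 88, 109, 96, 100, 93]

# A coefficient near 1.0 means the test is highly consistent;
# a value near 0 would be the bathroom scale that reads
# differently every time you step on it.
r = correlation(first_administration, second_administration)
print(f"test-retest reliability: r = {r:.2f}")
```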
[00:02:08] Speaker B: None of them are reliable.
[00:02:11] Speaker A: Well, that's a topic for a different time.
[00:02:17] Speaker B: So our topic today is how to read an assessment report.
[00:02:21] Speaker A: Yes, we're really kind of excited about this one, because we get asked this a lot, and frankly, we put a lot of time and effort into our assessment reports. So I'm excited to talk about it: how do you read this assessment report? How do you get the information out of this report if you are not a practitioner?
[00:02:46] Speaker B: Exactly. Because these reports are really long. I'm gonna be very frank about that. We make them long.
[00:02:53] Speaker A: Yes.
[00:02:54] Speaker B: They have lots of numbers and details.
The language used in these reports is usually dense and is full of jargon.
[00:03:03] Speaker A: Yes.
[00:03:04] Speaker B: And it's usually aimed at physicians or psychologists.
And not always with the parent in mind.
[00:03:11] Speaker A: Exactly. I mean, here at DAC, we try our very best to satisfy a wide audience.
[00:03:18] Speaker B: Correct.
[00:03:19] Speaker A: And one of my favorite things about the way we do our assessments here is that we really do think about the parent when we write these reports. We want to communicate directly with them, or with the adult we may have just assessed, who we may have found has dyslexia. So how do we explain that to them in a written format, which we already know is probably going to be hard for that person? We do spend a lot of time here trying to address those kinds of things.
[00:04:01] Speaker B: Correct. But that doesn't always happen. So sometimes you get a report that doesn't have much narrative explanation, and it might just have the data tables.
[00:04:14] Speaker A: These reports often use clinical terms. Right. Like poor, very poor, below average.
These terms could be a bit shocking to parents and they can be really emotionally difficult because it directly relates to our kids. We are going to talk specifically about those terms a little bit later in the podcast and kind of give them a more narrow definition.
But that can be a really hard thing when you are getting a report.
[00:04:51] Speaker B: Correct.
[00:04:52] Speaker A: Without much explanation.
[00:04:54] Speaker B: Yes.
There's generally an accepted order for how results are presented. Most reports will start with a cover letter that summarizes the reason for the assessment, when and where the assessment took place, and who performed the assessment.
[00:05:13] Speaker A: And it will include ideally, that person's signature and their qualifications. Right?
[00:05:19] Speaker B: Correct.
[00:05:20] Speaker A: Why are you qualified to be giving this assessment in the first place? That should be on that cover letter.
[00:05:28] Speaker B: Right.
And generally, next on the assessment report, there's background information. That's that history that we collect.
I really encourage parents to read through this section, because sometimes we make mistakes and there's incorrect information in there, or there may be spelling errors or missing information. Or maybe you left and thought, oh, I forgot to tell them X, Y, and Z. We can always add that, and especially at our organization, we will definitely amend it.
[00:06:04] Speaker A: Yeah, we will amend it. I do that.
I do that fairly often, actually, where a parent will say, actually, I kind of feel like I want this included, or, you know what, can we actually leave that detail out? I don't think it's relevant. Most of the time, I defer to the parent on the things they are expressing to me about that background section. The only thing I would not do is misrepresent background information.
[00:06:42] Speaker B: Correct.
[00:06:43] Speaker A: But, you know, nobody's usually going to ask you to do that, though it can happen.
[00:06:52] Speaker B: And then after the background sections, there's usually the different assessments organized in different ways. So typically there's a chart with data and then narrative below the chart that hopefully explains the numbers and what they all mean.
[00:07:10] Speaker A: If there has been an informal assessment given, that usually does not include a data table. But it might have a narrative description, and there might be some numbers associated with it. For example, with the 26 letters of the alphabet, it might say 24 out of 26.
So that data might be in a chart, but it's not going to have a statistical measure attached.
[00:07:48] Speaker B: So, Maggie, what are the numbers that are in a chart typically?
[00:07:52] Speaker A: So typically, if it is a standardized, norm-referenced test, it's going to have a raw score for each subtest that was given. The raw score is the score the student got on that subtest, and it really cannot be interpreted alone; it will vary widely.
It's literally just how many items that student got correct on that subtest.
Percentile rank is something we talk about a lot throughout the assessments, and percentiles are included in a data table. A percentile is used to compare this student with their same-age or same-grade peers.
The average for those same-age or grade peers is the 50th percentile.
And then the average range is typically considered between the 25th and the 75th percentile. Those of you who want to can close your eyes and remember your high school statistics class.
Okay.
That is when you talked about the normal curve, or the bell curve. When we are talking about the 25th to 75th percentile, that is the fat part of the bell curve. That is what's considered statistically average.
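For anyone who wants to see that idea in code, here is a tiny sketch, with an invented norm sample, of what a percentile rank actually computes: the share of the comparison group scoring at or below a given score.

```python
# A percentile rank is the percentage of the norming sample that a
# given score equals or beats. The norm sample here is invented.
def percentile_rank(score: float, norm_sample: list[float]) -> float:
    at_or_below = sum(1 for s in norm_sample if s <= score)
    return 100 * at_or_below / len(norm_sample)

norm_sample = [12, 15, 18, 20, 21, 22, 23, 25, 28, 34]  # hypothetical peers

print(percentile_rank(21, norm_sample))  # 50.0 -> right at the average
print(percentile_rank(25, norm_sample))  # 80.0 -> above the 25th-75th band
```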
[00:09:39] Speaker B: Correct. Although sometimes in some of our tests we put low average for the 25th to 37th percentile, because research suggests that scores in that 25th to 37th percentile range can decrease as content gets more complex.
In other words, the gap tends to widen as students progress through school or through life.
[00:10:08] Speaker A: Exactly.
[00:10:09] Speaker B: Right.
[00:10:10] Speaker A: Makes sense if you're looking at a skill set.
[00:10:13] Speaker B: Right.
[00:10:13] Speaker A: That's in that low average range.
And we have done no intervention there.
That will decrease.
[00:10:22] Speaker B: Correct.
[00:10:23] Speaker A: Over time.
[00:10:24] Speaker B: Exactly.
And 25 to 75 is a huge range.
[00:10:29] Speaker A: That is a huge range. Yes.
So we also will include a standard or scaled score. Which one depends on the test; usually it's whatever the publisher provides us.
Some publishers will do both. Some will have either standard or scaled; it really depends on the testing measure.
A standard score is very similar to a percentile in that it compares the student to their peers, but the average, the very middle part of the bell curve, for a standard score is 100.
For a scaled score, the average, the middle number, is usually 10.
So again, it depends on the assessment, but both standard and scaled scores refer to a normal bell curve.
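As a quick illustration of how both score types sit on the same bell curve, here is a sketch assuming the common conventions of mean 100 with SD 15 for standard scores and mean 10 with SD 3 for scaled scores; a specific test's manual is always the authority.

```python
# Standard scores (commonly mean 100, SD 15) and scaled scores
# (commonly mean 10, SD 3) both map onto the same normal curve.
# These means/SDs are common conventions; check the test manual.
from statistics import NormalDist

def percentile_from(score: float, mean: float, sd: float) -> float:
    return 100 * NormalDist(mean, sd).cdf(score)

print(f"standard score 100 -> {percentile_from(100, 100, 15):.0f}th percentile")
print(f"scaled score 10    -> {percentile_from(10, 10, 3):.0f}th percentile")
print(f"standard score 85  -> {percentile_from(85, 100, 15):.0f}th percentile")
```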
[00:11:31] Speaker B: Correct.
And then we get to the age and grade equivalents. Most of the tests have these. In our center, we don't always include them unless the assessment is based on them, mainly due to the confusion about these numbers. Sometimes they're not as accurate as the other data we can pull out of the tests. An age or grade equivalent also often does not actually reflect the student's overall abilities, and it cannot be used to accurately monitor progress.
[00:12:08] Speaker A: Exactly. Age and grade equivalents are not at all the same thing as age- and grade-based norms. The way that grade and age equivalents are calculated is that the test compares the student's performance to the average performance of students at other ages or grade levels.
So for example, to make this more concrete, let's say the average score of a first grader on one specific test is 50 points. The student that I'm testing scores 50 points. Regardless of their own grade, and this could be a 10th grader, this measure would say that this student's performance is comparable to a first grader's performance.
[00:13:08] Speaker B: Correct.
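To spell out the mechanics, here is a toy sketch, with entirely invented numbers, of how a grade equivalent is essentially a lookup against the average raw score at each grade, ignoring the student's own grade.

```python
# A toy grade-equivalent lookup: the student's raw score is matched to
# the grade whose *average* raw score it most resembles, regardless of
# the student's actual grade. All numbers here are invented.
average_raw_score_by_grade = {1: 50, 2: 62, 3: 71, 4: 78, 5: 84}

def grade_equivalent(raw_score: int) -> int:
    return min(average_raw_score_by_grade,
               key=lambda g: abs(average_raw_score_by_grade[g] - raw_score))

# A 10th grader who scores 50 points gets a grade equivalent of 1,
# i.e. "performs like a first grader", which is exactly why these
# numbers can carry such misleading emotional weight.
print(grade_equivalent(50))  # 1
```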
[00:13:09] Speaker A: I feel pretty strongly about this topic.
I never include age and grade equivalents in these reports unless a physician specifically asks for them, because they are not accurate and not a good reflection. And as someone who has a lot of experience at the high school level, these measures often carry a very heavy emotional weight.
They don't actually mean that this kid has, you know, first grade ability. But that is how they are taken.
And I have never seen them lead to a good outcome.
[00:14:03] Speaker B: A test will also likely include a descriptive term based on the standard score in the percentile rank.
[00:14:10] Speaker A: Yes. So this is what we were kind of talking about before. Here's where a very common set of descriptive terms comes in: very poor, which is less than the 5th percentile. Then poor.
[00:14:29] Speaker B: Poor is between the 5th and 9th percentile; below average, 9th to 25th percentile; and then low average, 25th to 37th percentile. Then we have our average range, 37th to 63rd percentile.
[00:14:44] Speaker A: Then we have high average, 63rd to.
[00:14:46] Speaker B: 75th percentile. Above average is the 75th to 91st percentile, and then superior is the 91st to 100th, which I don't see very often.
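Putting those cut points together, here is a quick lookup sketch using the bands exactly as listed in this episode; other publishers' manuals draw the lines differently, and the boundary handling here is our own simplification.

```python
# Descriptive-term bands as listed in this episode. A publisher's
# manual may define different cut points and boundary rules.
BANDS = [
    (5,   "very poor"),      # below the 5th percentile
    (9,   "poor"),           # 5th to 9th
    (25,  "below average"),  # 9th to 25th
    (37,  "low average"),    # 25th to 37th
    (63,  "average"),        # 37th to 63rd
    (75,  "high average"),   # 63rd to 75th
    (91,  "above average"),  # 75th to 91st
]

def descriptive_term(percentile: float) -> str:
    for upper_bound, term in BANDS:
        if percentile < upper_bound:
            return term
    return "superior"  # 91st to 100th

print(descriptive_term(4))   # very poor
print(descriptive_term(50))  # average
print(descriptive_term(95))  # superior
```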
[00:14:59] Speaker A: I sometimes do on our memory measures.
[00:15:02] Speaker B: The memory. Yes.
[00:15:04] Speaker A: I've also seen it in our listening comprehension tests.
[00:15:10] Speaker B: But it's still not very, very.
[00:15:12] Speaker A: It's not very often. Think about it: if you're really closing your eyes and picturing that normal curve, the reason it tails off is that that is not where most people lie.
[00:15:28] Speaker B: Yeah.
[00:15:28] Speaker A: Right. If we're talking about that big old bell curve and we're using percentiles, we're saying, let's say there are 100 people in this room.
If you are at the 91st percentile, you have scored as well as or better than 91 of those people.
So yes, we don't often see that superior score.
And if Nicole and I worked in a setting where we were actually testing mostly average students, we would also not see the very poor range very often.
[00:16:11] Speaker B: Correct.
[00:16:11] Speaker A: Our data is skewed, because the people who come to us for assessments typically are actually struggling in reading. So Nicole and I see poor or very poor scores more often than the test's norms would predict.
[00:16:27] Speaker B: Correct.
These descriptive terms are only meant to give meaning and context to the numerical data.
Sometimes when a student's score falls in the poor or very poor range, these words can evoke a strong emotional response from parents or even the students.
[00:16:45] Speaker A: Yes.
[00:16:48] Speaker B: So as a parent with kids that have fallen into these areas, it is very hard to see that.
[00:16:56] Speaker A: Yes.
[00:16:58] Speaker B: And you have to be very aware that that's one subtest of your child. It's not the whole picture of your child.
[00:17:08] Speaker A: That's exactly right. This is. This kid scored in the poor or very poor range on this one set of skills. That does not mean that overall your child has poor or very poor abilities.
The reason we do include those descriptive terms is literally just to give context. Based on the test itself, they are generally easier to interpret than something like a percentile rank or a scaled score.
[00:17:41] Speaker B: Right.
And doctors look at.
[00:17:44] Speaker A: Doctors do look at them, and so do school professionals when they are making decisions about whether the student meets criteria for an IEP or a 504. They will look not only at the statistical data, but also at those norm-referenced descriptive terms.
So we do include them.
They are not subjective. They are objective terms used by the publisher; these are not terms that we've decided on ourselves.
[00:18:25] Speaker B: Yes. They're actually in the assessment manual; based on the numbers that are there, the manual tells us what terms we're supposed to use. That's why we use them, because that's what we have to do when we're doing an assessment like this.
[00:18:39] Speaker A: Yes.
[00:18:39] Speaker B: You use what the publisher has included in the diagnostic criteria.
[00:18:46] Speaker A: Yes.
[00:18:50] Speaker B: So some assessments also give you a nice graph that shows the standard or scaled scores that we include in the report. And this is a really nice way to give you that visual representation of the student's strengths and weaknesses.
[00:19:07] Speaker A: Yeah, I always like to include those in the reports, because you will see dots above that average line and dots below it. When we actually map this out, all students, because this is how human brains work, will have some areas of real strength and some areas of real challenge. It helps you really see that that's true.
[00:19:41] Speaker B: And then the report should have a conclusion, and this section should come out and specifically state whether or not the data suggests the student has dyslexia.
It should actually be written down.
[00:19:54] Speaker A: And it should really come, like, right away. These reports should not leave you hanging, folks.
The first line of our conclusion states whether or not we think the student has dyslexia.
[00:20:11] Speaker B: And sometimes we even bold them so that if somebody's just skimming the report, they can just get to that piece and look at it and then go back and see why.
[00:20:23] Speaker A: Yeah, that's exactly right.
The way that we word our reports here specifically: one, we will say the data is consistent with the diagnosis of dyslexia. That means, yes, we do think this student has dyslexia.
We might say the data is not consistent with the diagnosis of dyslexia, which means we do not think this student has it. Or we might say the data is inconclusive, in which case we go on and explain why we were not able to make a clinical decision.
That is a case by case situation.
[00:21:14] Speaker B: Right. And I would say most often with that one, it's a really little kid.
[00:21:19] Speaker A: Yes.
[00:21:20] Speaker B: And could it be that they were just not exposed yet to certain things versus do they have dyslexia at that age? It might be hard to tell based on their history.
[00:21:32] Speaker A: Yep, exactly. And so then we would also go on to give recommendations. Here's what the next steps are.
We also might recommend a retest. Usually we'll say give it a year or two and come back. But come back, and don't go without intervention in the meantime.
[00:21:53] Speaker B: Right.
But if you're seeing issues and it's inconclusive, still go for intervention.
[00:22:01] Speaker A: Still go for intervention. But we just can't At a clinical level, make a call whether or not it's dyslexia.
[00:22:10] Speaker B: Correct.
And then a good conclusion would also include the definition of dyslexia, from both the International Dyslexia Association and the Diagnostic and Statistical Manual, the DSM.
[00:22:22] Speaker A: Yeah, our report does include both of those definitions, and it's pretty important that we're representing that correctly.
[00:22:36] Speaker B: Yes, for sure.
And also, if there's anything else that is recommended, like maybe we think another test should be given, or maybe we feel a different area might need testing, that would also be in the conclusion.
[00:22:56] Speaker A: Yes. In that conclusion, we would talk about which areas are impacted, since there are so many.
We would talk specifically: do we think it's phonological awareness, phonological memory? Do we think it's rapid automatized naming?
What is speaking to us as far as written expression? And then any other language areas. Sometimes we'll see receptive language also showing a deficit, and that always kind of makes us go, I wonder if there could be more to this kid.
Yeah, we think it's dyslexia, and we also think that more testing on certain things is recommended. Something pretty common that we see is that we think there may also be ADHD.
And we may come out and say that in the report. We're not going to say we think this kid has ADHD, because that's not what we looked for today.
[00:24:04] Speaker B: But we might say may benefit from further testing for, and then fill in the blank, because, you know.
[00:24:13] Speaker A: Right. And we would say why, too. Executive functioning seemed to be impaired. Sequencing seemed to be impaired. Or this kid literally could not sit still at all throughout the assessment.
[00:24:31] Speaker B: I think the other one we refer out a lot for is speech therapy, if we notice a language concern like articulation or something else that gets flagged. Some of the assessments we use have a language component, and sometimes if those areas are low, then we would recommend further testing.
[00:24:53] Speaker A: Yes.
So, yeah, I think Nicole and I were both excited about doing this episode, because this is a big part of what we do here, and I like sharing with our listeners what goes into these reports.
What are we thinking about as somebody who's giving a diagnosis? What are we thinking about when we're giving these tests, and why do we take it so seriously?
When we're telling a parent whether or not we think their kid has dyslexia, that's a really deep responsibility.
And we both take it very seriously.
[00:25:52] Speaker B: I would say our whole assessment team does.
[00:25:55] Speaker A: Absolutely.
[00:25:56] Speaker B: We have a whole team, I think we said in the last episode, and we make sure everything is evidence based and we use each other to bounce ideas off of.
And the reason we put 'is consistent with' is because really a medical doctor is the only one who can put it on the diagnosis list. That's why these assessments have so much data on them: the doctor wants to see that data to be able to say, yes, I'm going to put this on your diagnosis list. And that's what we recommend for students if they want accommodations in high school and college, because you need to show that it started in childhood, which is why we do the big background history also.
And you also have to be able to show that it's been a consistent issue throughout academics.
[00:26:50] Speaker A: Yeah. You know, these assessments often serve as the beginning of a paper trail for the student, and they do end up following these students for a very long time.
[00:27:06] Speaker B: So definitely keep it in a safe place. We get a lot of calls about assessments from 10 years ago: where can you find my assessment? And I'm like, I wasn't even here then.
[00:27:19] Speaker A: Yeah, well, that's really important to note, too, because actually, in compliance with HIPAA, no, we don't have that from 10 years ago. We need to shred them after seven years. So, no.
[00:27:34] Speaker B: But we still get the calls. So just so you know, they might be needed later, especially if you're going to take a test for grad school; they seem to need more of the paper trail for those tests. We have had people have to come in and get retested, and then we have to document so much of their history, and they have to give us all their academic records, and then we have to compile a report to show that they had intervention throughout their academic history. So if you have this report, it's good to keep it in a safe spot so that you can bring it out when you need it.
[00:28:21] Speaker A: Yeah, it's some.
You know, as someone who was a high school special education teacher, my job was transitioning students out of the public schools. And I. They would come to me already with a pretty thick file, most of them, and it would have been data from kindergarten. And it was actually one of actually my favorite parts of that transition job because I got to see it in so many cases, like, wow, this kid really is kind of the same kid, but just older or whoa, this kid has made this crazy amount of progress.
So I guess what we're saying is make sure that we are keeping these assessments in a safe place, and they are going to be relevant for this, for this kid for a long time.
[00:29:26] Speaker B: So, Maggie, what's going on beyond dyslexia?
[00:29:29] Speaker A: Let's see. So something kind of fun.
We went to the zoo last weekend. Not last weekend, but the weekend before.
And for those of you not in southeastern Wisconsin, we went to the Milwaukee County Zoo and they have a new octopus.
[00:29:49] Speaker B: Really?
[00:29:50] Speaker A: Actually fairly recent.
And he was really active.
[00:29:56] Speaker B: Okay.
[00:29:56] Speaker A: We went on a day when the weather was, like, kind of crappy, so not a lot of other people wanted to be at the zoo, and it was, like, empty.
So we got to get right up to the glass. He was playing with a ball, and then he also had, like, a puzzle toy, and he was manipulating this puzzle toy with two or three of his tentacles and wrapping his other tentacles around it.
It was the coolest thing I've seen in a really long time.
[00:30:31] Speaker B: That is pretty cool.
[00:30:33] Speaker A: Yeah.
[00:30:33] Speaker B: I might have to go to the zoo soon.
[00:30:34] Speaker A: It was very, very cool.
Highly recommend.
[00:30:42] Speaker B: Kind of a similar thing: we went to the Milwaukee Public Museum over the weekend. And, you know, I've been there so many times.
[00:30:53] Speaker A: Yes.
[00:30:54] Speaker B: And it's a very old, established museum. Didn't they say it was developed in the 1800s sometime?
[00:31:01] Speaker A: I think so, too, like its first charter. I think it's moved buildings a few times, but.
[00:31:06] Speaker B: But yeah. Yeah. So it's been pretty similar my entire life, but they're actually building a new one and it looks like a spaceship, which is really weird to me. And I'm kind of sad because I really like.
[00:31:20] Speaker A: I like.
[00:31:22] Speaker B: I like some of the displays and I hear that they're going to be changing a little bit. But I was really excited because we ended up staying four hours, which for my kids is pretty amazing.
[00:31:33] Speaker A: Huge.
[00:31:34] Speaker B: It is huge. And, you know, they all had fun and they all liked learning and it was really weird because that's not normally what they want to do.
[00:31:45] Speaker A: That's like the dream.
[00:31:46] Speaker B: I know, right?
[00:31:47] Speaker A: The one time it happened. Just mark it down.
Put that in the folder where you keep, like, the educational memories.
[00:31:55] Speaker B: Yes.
[00:31:56] Speaker A: And never, ever let it go.
[00:31:57] Speaker B: Yeah, that's for sure. It happened.
[00:32:01] Speaker A: Yay. That's amazing.
Well, thank you, everybody, for listening.
Please follow us on social media, and reach out if you have any questions or would like us to discuss a topic. If you do like our show, please give us a rating on your favorite podcast player. That is how we reach more listeners and get to help more families. Thank you, everybody.
[00:32:25] Speaker B: Thank you.