Applying fundamentals of science to sport: threats, challenges, solutions
This webinar was recorded on 12 June 2020 and hosted by Dr Blake McLean from the Human Performance Research Centre.
Blake McLean:
... today's webinar. A couple of housekeeping things I want to run through before we kick off. We are recording this so people can check in and watch the video if they're not with us live, so please be aware that we're recording. If you have any questions throughout the webinar, there's a question and answer box down the bottom, so please flick your questions through there. They'll come through to me, and I'll feed them through to Aaron and Franco as we go. So please feel free to send those through at any time.
Blake McLean:
To kick off today I'd like to acknowledge the Gadigal People of the Eora Nation, upon whose ancestral lands our campus now stands. I'd also like to acknowledge the elders, both past and present, and acknowledge them as the traditional custodians of knowledge of this land. So thank you to Aaron and Franco for joining us today and giving up their time to come and chat to us about all things sport science and human performance. To start with, we're just going to launch into a little bit of background. And we'll start with you, Aaron: can you give us a little bit about the program here at UTS, how this topic has evolved into one of the things we're really passionate about here at UTS, and how you and Franco tie into that?
Aaron Coutts:
To be honest, Blake... Well, welcome everybody. Thanks for tuning in. One of the reasons we did this today was the lunchtime conversations we have here, back when we were allowed to come to work. Every day over lunch we'd have discussions about whether sport science is broken, and obviously we have our track record here of doing applied research and research in partnership with sport, and we've developed the Human Performance Research Centre.
Aaron Coutts:
And I suppose we thought, with COVID on, and given we normally have a symposium at this time of year, that we could put this webinar on and everybody else could join in our discussions. Because these are just discussions, but I think it's really worth understanding the threats and challenges that exist. Obviously I've been here at UTS for 17 years, but recently we've had rapid growth and we recruited a few people to our team. And one person is the guy on the panel with us, Franco.
Aaron Coutts:
And I've worked with Franco for almost 20 years now, and he's definitely informed the way I go about my work. He's passionate about research methods, passionate about doing high quality research in sport science, and I thought, "Well, rather than just us benefiting from it, let's have a chat with Franco, and we can discuss the issues that we see." And I'm sure the people who are listening have read Franco's tweets, if not his papers. I'm not sure which is more popular at the moment.
Aaron Coutts:
But we clearly want to convey some of the issues we've seen, the work we've done around some of the load monitoring metrics, some of the work we've got coming out on athlete wellness measures, and the issues that are occurring and that we're seeing. Franco and I, and the whole group of us actually, have robust discussions around threats and challenges to sport science. Hey Franco, do you want to give a bit of background to your perspectives on this?
Franco Impellizzeri:
Yes, I'll try. Thank you very much for this initiative. I think it's potentially very useful. Yeah, that's true, we spend a lot of time arguing with each other during lunch. I think the main reason is because you speak so fast that I don't really understand you, and you don't understand me because of my Godfather English. So that's probably the main reason.
Franco Impellizzeri:
But joking apart, the reason I like to know more about research methods is because, basically, I'm a very insecure person, so I try to face this problem by trying to read more. So there is a psychological reason why I'm so passionate about research methods: I'm afraid of making mistakes, or better, more than making mistakes, I'm afraid of making too many mistakes. And that's the main reason.
Franco Impellizzeri:
So, of course, what I've understood over the years is that, to solve problems, the first step is to recognize the problems. And that's why I sometimes try to underline some issues we have in sport science, because if we agree that there are some issues, maybe we can find a solution together. That's the main reason. And I'd like to say that I joined social media because you suggested I join, because I wasn't even on Twitter. So it's your fault. If you're not happy now, it's too late. So learn from your mistakes next time, yeah?
Aaron Coutts:
Yeah, I do lose some sleep over your tweets, there's no doubt about it. There's a bit of a slide there, Blake, I don't know if you've shown it, about how Franco envisions the background to this, around "is sport science broken?" It sort of depicts our lunchtime conversations, and I'm usually the one getting belted by Godzilla, because Franco, I might-
Blake McLean:
This is just an image of what Franco thinks he looks like while he's on Twitter. [crosstalk 00:05:26] I don't know which one you are in this?
Aaron Coutts:
... I'm usually the one losing. But more seriously, I tend to take an optimistic view. I think sport science research over the last 20 years has taken a big leap forward in terms of trying to bridge the gap and develop evidence-based practice. But one of the issues in evidence-based practice is having good quality evidence, and I think this is Franco's point: some of the evidence that we've provided is pretty flimsy. Some of the methods we use have not been validated, and some of the conclusions and common thought are probably not supported by robust methods and underpinnings. You'd agree with that, Franco?
Franco Impellizzeri:
Yeah. No, I agree.
Aaron Coutts:
So what we thought for today, we've got a list of things we'll talk about and just generally chat through, and hopefully Blake can show that list up now. Just a general chat, rather than a presentation. And we thought about some of the themes that pose threats to our profession, or our science, moving forward. Because ultimately, as applied sport scientists, we're scientists. We have dipped our toe a little bit in the practitioner field, but fundamentally we're scientists. And the first threat we'll kick off with is gurus versus scientists, and understanding our role.
Aaron Coutts:
And one of the things that I see occurring is that there's a lot of popularity that can arise, and you can confirm your own biases, I suppose, by gaining popularity. We need to understand that our role is as scientists, and therefore the level of evidence that applies to medicine should apply to our field as well.
Aaron Coutts:
And unfortunately, as I see it, with some of the methods we've used, our methodology training has needed to improve, and I've definitely learned that. If we look back at the papers we did 20 years ago, I shake my head at some of the mistakes we made. But the thing about being scientists is that we continually improve and evolve. And I think our role is to internally reflect, understand that we need to continually improve, and to do that, I think there are a lot of things that play a role, but in particular methodology training and robust methodology.
Blake McLean:
I think, Franco, one of the things that I've been learning from you is not throwing everything out, not saying, "This is a bad method. This is a good method." But understanding the methods that we have, where the limitations are around them, and starting to develop better and more validated methods which might give us better information. There are a few different areas where you're working through that process; did you want to talk to us about your perspective and thoughts there?
Franco Impellizzeri:
Yeah, you're right. The main point, which maybe we'll discuss a bit better later, is that we need to be transparent. So it's not a question of limitations, all studies have limitations; it's a question of acknowledging the limitations and considering them when we deliver recommendations and suggestions, or when we write the conclusions of our studies.
Franco Impellizzeri:
So the difference between gurus and scientists, in my opinion, is exactly this: recognizing the limitations. As scientists, we have beliefs, because we all have beliefs, and we have confidence in these beliefs. Our confidence is fueled by the evidence. When you transform your belief into something like faith, you're moving in the direction of becoming a guru.
Franco Impellizzeri:
So it's a mental approach that differentiates a scientist from a guru. And as a consequence, the information we deliver to the, let's say, end users or practitioners changes accordingly. So yeah, improving the methodology is just a question of providing information that is commensurate with the limitations of the methods we use to get [inaudible 00:09:41] information.
Aaron Coutts:
I've got a question here from Daniel Greenwood, who actually went to UTS many moons ago. He's asking whether we're talking about what's currently being done. Well obviously, it's easy to generalize. There's obviously good work being done everywhere, but there's also work that can be improved. And when I'm talking about the idea of loose standards, it's not just what we accept as fact; it's more about the standards of methodology.
Aaron Coutts:
One example, and I won't embarrass Blake, but I will: we did a project many moons ago, over 10 years ago, where we used this concept of athlete wellness, and at the time it was with good intention, to have a questionnaire to measure wellness. These questionnaires are popular everywhere now. But we just used something that was applied by a club, without actually looking at whether the methodology had been validated using accepted psychometric validation measures. And some of the work that we're doing now is unpacking those tools.
Aaron Coutts:
So I suppose some of the looseness I allude to is even around assessing the validity of some of the tools we use, and some of the designs. Some of the designs that we use in research are post hoc, and if you look at the levels of evidence, we tend to use lower levels of evidence to support the claims we develop, particularly in the applied field. And again, I don't want to say it's all broken, that's not what I'm alluding to. Actually, I'm a bit more optimistic than even Franco. But it definitely can improve, and this is an area I think we definitely want to work on as a group.
Blake McLean:
I think one of the points there, Aaron, is that it's not about trying to develop methods just because they're scientifically robust. The point of them being scientifically robust is that we can understand exactly what we're measuring a little bit better. And if we can understand what we're measuring, we can better understand the inputs and how to manipulate them, and how to change the response that we're getting from a group of athletes, or individuals within that group.
Blake McLean:
So if the tool is really blunt and not that valid, we kind of don't know what information we're getting back. And I know you're really passionate about that area, Franco, about understanding what the information means and, therefore, how we can utilize it.
Franco Impellizzeri:
Yeah, I think it's not a question of creating methods. The methods are all out there. We just need to improve our background knowledge of these methods and understand when to use them. Because this is an argument I had with Aaron, about whether sport science is broken or not. But even if it's broken, it can be fixed, that's the point. So at the end of the day, when I say sport science is broken it's a bit provocative, which is unusual for me, but it's just a way to warn about the errors that we make every day when we do our job.
Franco Impellizzeri:
As I said, I spent 10 years in a clinical setting. When I first started to approach clinical studies, even though there are problems there too, of course, as in any discipline, I realized that there was a gap between the methods we were using in sport science and the methods I was starting to use. And so I realized, even 10 years ago, that we had a lot of room for improvement. So the idea that it's broken, okay, may be a bad framing, but I see the positive side of this perspective, which is that we can fix it.
Aaron Coutts:
And certainly this approach you've brought in has definitely improved the researchers we have here and our research students; that influence has been really strong. And I think we have a responsibility now, having been doing this for nearly 20 years, of leading the discussion. Others are more than welcome to join in, those other groups that exist internationally. But I think our responsibility now is to acknowledge we've all made mistakes, and we can improve things, and we need to evolve. And that's what science does, right? We evolve to answer questions, and a big, big part of that is addressing the methodology. Some of the other issues that exist are... oh, that's a nice slide you've got there, Blake.
Blake McLean:
[crosstalk 00:14:14] special.
Aaron Coutts:
Franco made this one, I'm sure. It's bias, and how bias fits in. Obviously it's really good; I've got the one there of Burgundy: "Everything I find that agrees with what I think reminds me of how right I am." And that's easy to do. We can find confirmation bias everywhere in our research. And I suppose, understanding that bias exists, that it exists in our designs and all through the research process, understanding first of all where bias exists and acknowledging it, is really the first step in improving the processes we're going through.
Blake McLean:
I think one thing we've spoken about there, Aaron, is that we all like to say that bias exists and that we should acknowledge bias, but it's really not part of the culture of our industry, I feel. I don't know if you guys have a different perspective, but we almost like to hide our biases rather than acknowledge that we have them. It's seen as a bad thing to have a bias and acknowledge it, as opposed to accepting that we all have our different biases, putting that out there, and framing the information and communication in that context. But Franco, I know you have some pretty strong views on this, given how pervasive it is in the medical industry and your experiences there. So I don't know if you want to share some thoughts or a take from that perspective?
Franco Impellizzeri:
Yeah. As you said, we are all biased. There's no way to avoid bias. The only thing we can try to do is recognize and acknowledge that we are biased. In the end, most research methods have been developed to control some source of bias, for example. But the problem is that when we don't realize we are biased, this fuels the fallacies in our arguments, for example. So we start to provide bad arguments. And again, this is related to the guru versus scientist approach.
Franco Impellizzeri:
So in my opinion, as soon as we don't acknowledge and recognize our biases, we are moving in the direction of a guru, because when we want to argue against other opinions, for example, we end up resorting to pseudoscience.
Franco Impellizzeri:
So I think it's important to acknowledge and recognize the bias. And this can make a huge difference, because if we don't realize it, I think we will go on providing information that is biased. Sometimes I read very nice papers that I think were methodologically very well done, and the statistics looked good to me, but the conclusions were biased. And the conclusion that you write in the abstract is the message circulating around on Twitter, on social media. And this is a problem, because in the end, we popularize biased conclusions and biased opinions.
Aaron Coutts:
A big one there, Franco, is the practical applications sections in papers. We're almost forced into writing practical applications sections in papers now, and they're the catchy part. So most young researchers tend to read the abstract, the introduction and the practical applications or conclusions, right? Which is very, very dangerous. But we're also always forced, or recommended, to write three or four points of practical applications. Quite often, research doesn't have a practical application, and that's okay, but where you see the biases occur is where people write practical applications that may not be supported by their data.
Aaron Coutts:
And that's where I see some of the misconceptions, and they're quite often then picked up on social media, because that's what's captured there. That's where I see some of the concerns about bias permeating from our research into practice. So absolutely, I think that's where the bias is most commonly seen: in translating into the conclusions and practical applications.
Aaron Coutts:
The part that we've talked about a bit is the bias that occurs when we see criticism of something like session RPE, right? We always think the critics are wrong. But that's a little bit of our own bias. At least we have some evidence to support our position on it as a valid metric.
Franco Impellizzeri:
Yeah, and we did that at Science and Medicine in Football. With [inaudible 00:18:52] we decided to remove the requirement for a practical applications section from the instructions. So you're allowed to provide practical applications if there are any, exactly for the reason you just mentioned. Because we realized that we were forcing people to find practical applications, for example, for descriptive studies. Of course, a descriptive study can give some practical applications. But those practical applications are speculations.
Franco Impellizzeri:
And when you write practical applications in dot points, it's difficult to see, or to interpret, these core messages as opinions, basically. But the descriptive study has a heavy role in the scientific process: developing new ideas, generating new hypotheses and things like that. So it's fine. It's not something against descriptive studies; it's against the use of descriptive studies for statements that are too strong in relation to the strength of the evidence and the results behind them.
Aaron Coutts:
The best one is, right: "This data can be used to inform training prescription."
Franco Impellizzeri:
Yeah.
Blake McLean:
On the practical applications, we've got a question coming in here; I just want to flip it on its head a little bit. Lorenzo has asked: what happens when there's a method that's been used for a long time in practice, there's a lot of anecdotal feeling that it's a very good method, but there's limited evidence around it? How do we work through a situation like that? Because the types of papers and studies we're talking about take time, and there's a lot of pressure to get results now in high performance sport. So what's your take on how to work through that?
Aaron Coutts:
Well my opinion is, this is where partnering research and practice plays a role. We can't rush research. Research takes time. You should have seen, Blake, the amount of effort Franco went to in looking at the acute:chronic workload ratio. It took him 12 months of, I'd say, 60 hours a week to go through that, and that's probably underselling the work he did. But we can't rush these things. Definitely, Lorenzo, we need to go and do the work, absolutely.
Aaron Coutts:
Now you can still continue using it, but use it at your own risk, [inaudible 00:21:23] accept that we don't know. But we definitely need to have that partnering, and that fast and slow concept where we can partner with others. It's probably not the job of practitioners, per se, to create the evidence and establish the validity of any tools or measures; that's the role of scientists. But I think by working together we can identify the important things to look at. We have to understand that our role in developing new methodologies does take time, and I think there are risks in rushing it.
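For readers unfamiliar with the metric Aaron mentions above, here is a minimal sketch, in Python, of how an acute:chronic workload ratio is typically computed. The 7-day acute and 28-day chronic rolling windows are the commonly used convention, and all load values below are invented for illustration; none of this comes from the webinar itself.

import numpy as np

# Hypothetical daily training loads in arbitrary units (e.g. session RPE x minutes).
daily_load = np.array([400, 350, 0, 500, 450, 0, 300,   # week 1
                       420, 380, 0, 520, 400, 0, 310,   # week 2
                       410, 360, 0, 480, 430, 0, 290,   # week 3
                       600, 550, 0, 650, 500, 0, 400])  # week 4, a heavier week

acute = daily_load[-7:].mean()      # average load over the most recent 7 days
chronic = daily_load[-28:].mean()   # average load over the most recent 28 days
print(f"ACWR = {acute / chronic:.2f}")
# The acute window is also contained in the chronic window, one of the
# methodological issues raised in published critiques of this metric.

Computing the number is trivial; establishing what it validly tells you is the slow work Aaron is describing.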
Franco Impellizzeri:
Yeah, as I said, I think it's a question of transparency. So you can go on using a strategy, a method, or whatever you want, knowing that it is a speculation. For example, in our job, and I mean, Blake, you also train people, when we design a training program it's full of exercises, strategies, or things for which we don't have strong evidence.
Franco Impellizzeri:
So the point is not to stop doing things without evidence. In reality, what I often try to say is that we should go on doing what we are doing, just acknowledging that what we are doing may be wrong. That's the only issue, because if you are aware that what you are doing may be wrong, or can be optimized, or can be modified, you are more open to different perspectives and to new information. That's the point.
Franco Impellizzeri:
So of course, this is a slow thinking approach, but there's no other way to approach science. And I would say you can see now, in this emergency, how dangerous it is to push science too fast, because it produces bad studies and over-interpretations or wrong interpretations. So I think the slow and fast thinking approaches can sit together, as long as you acknowledge that what you are doing may be wrong.
Aaron Coutts:
And we've done this. A fine example is the wellness questionnaires, right? We've had these discussions. I say, "But they're such useful tools." And you say, "But they're not valid." Which is fair enough. So our solution is to take the time, once we've identified the issues, to go and develop valid methods. And that's the approach you take; it takes years to do. That doesn't mean we stop using the tools that we have; we just accept that the tools we have, whilst they're useful, may not do what we think they do. And that's where, I suppose, the research and that research process come into play.
Blake McLean:
Yeah, this hits on a topic which I really enjoy listening to you speak about, Franco, and that's acknowledging uncertainty and being comfortable with uncertainty in the scientific process. I think sometimes the misconception is that science provides answers, and people want answers and want absolutes. I feel, particularly in our industry, there's pressure on physical performance practitioners and sport scientists to bring more certainty. And sometimes people want to say that they have all the answers. So where does acknowledging, being comfortable with, and working with that uncertainty fit in the scientific process?
Franco Impellizzeri:
You know well that I'm trying to sensitize people to accepting uncertainty. And I think our role as sport scientists is also to try to educate all the stakeholders about the uncertainty of the process. I'll link this answer to the question from Mike that I see in the chat. Yes, of course, it's easier to communicate by oversimplifying things and being a little bit more assertive. It's probably more effective in terms of communication, but it's wrong. And this is exactly guru style, as you wrote. So I'm not saying it's easy, but I would be more careful, when we deliver information, about the terms we use and the way we deliver that information.
Franco Impellizzeri:
I know that we maybe don't have the impact that we would like to have, and it's even more complicated when we have to challenge assertions, statements, and claims from gurus, from professional gurus, but we don't have to transform ourselves into stronger gurus to face them. I don't have a solution, of course. I'm just saying that this is something we have to work on, understanding how to address this important issue. But the first point is to acknowledge that there's this problem of accepting uncertainty. I mean, uncertainty is everywhere. In medicine, the example I always propose is that when I started working with some doctors, they had a lot of information, and they had to decide whether or not to operate based on all this information. And that information is not interpreted in the same way by all the surgeons.
Franco Impellizzeri:
So out of five surgeons, maybe you have four deciding not to operate, and one deciding to operate. I don't see the situation in sport science as different. We have a lot of information and the decisions are personal, based on our beliefs, and this introduces uncertainty into the process. Because as soon as four coaches can decide four different things, it's likely that at least one of those decisions is not as good as the others. So we are introducing uncertainty.
Aaron Coutts:
I think practically, Franco, where the confusion occurs is that our role is to provide evidence to coaches, because usually the coaches are the ones who make the final decision, particularly in the applied field. We need to convey that uncertainty to coaches behind closed doors, perhaps, so they accept and understand the risk they're taking in making that decision. But where the mismatch occurs is that the coaches have then got to go and stand out in front of players and say, "This is what we're going to do." They're not going to convey uncertainty; they've got to convey belief. And that's where I think some of the conflict occurs in this case. Because many coaches I work with understand the uncertainty, but they've got to go out and stand in front and say, "This is how it is."
Aaron Coutts:
Now, it's not our role to make that decision. Our role, as I see it, is [inaudible 00:28:23] to inform the decision. The coach then accepts the responsibility, and obviously their job's on the line ultimately, because it reflects on the decisions that they make. So we need to understand where we fit in that decision-making process.
Blake McLean:
Yeah, I agree with that, but the coaching world is almost more complex these days, just because of the availability of information. So if you were to walk out in front of a group of athletes and say, "This is absolute. This is certain and we're going to go for this," you need to be careful, because they might jump on Twitter 10 minutes later and say, "Hang on a sec, I just read Franco's tweet, and he says that maybe that's wrong."
Aaron Coutts:
Well, we have the fine example, Blake, of your PhD in altitude training, right? It was in conflict with how you were operating at the time: altitude training was part of the program, and your PhD inconveniently found that it probably didn't have the benefit that you thought. And that's a difficult thing to deal with, because your research could undermine the messaging and the belief in the organization. And that's the real challenge that we have to deal with. But our duty is to deliver the evidence, to make the best available decision based on the strongest evidence available.
Franco Impellizzeri:
Yeah, we have to differentiate the roles and the levels of communication a bit, because communication between sport scientists and coaches is one issue, and from coaches to athletes is another. This is a discussion we've had several times, for example when we discussed significance in statistical analysis. The idea of providing a sort of rule to decide when to act or not to act is, in my opinion, a problem that we created. We don't have to say to the coach, "Do it, or don't do it." We just provide information, and then they decide whether to do it or not. Also because this is what they do every day: they have to make decisions. And of course the decision is dichotomous, either you use this training strategy or you don't. But the information that we provide shouldn't be dichotomous as well.
Franco Impellizzeri:
So we can just provide the strength of the evidence, and so the strengths of an approach, or its weaknesses and uncertainties, and this information is used by the coach to decide what to do. So it's their job, it's not our job. Of course, this becomes a bit more complicated when we cover both roles. This happened to me: when I had to train athletes while working as a researcher, it's even more complicated. In the end it was the only way I had to resolve this conflict, because it is conflicting when you do something knowing that maybe the evidence behind it is not strong, but you've done it for 10 years. It's difficult to change what you have done.
Franco Impellizzeri:
But I did this acknowledging that what I was doing was more tradition-led than scientifically supported, for example. And I've used the example of the deadlift with my friends several times. I have no idea whether it is really effective or not, but I like the deadlift. So I realized that I do it because I like it. And the other reasons I do it are that when people start to lift good weights they feel good, they feel [inaudible 00:32:02], all these sorts of things. So I don't have a real scientific, let's say physiological and biomechanical, rationale behind it, but I did it for years and I had no problems with that. But that's the difference. I approach coaching acknowledging, and aware, that what I'm doing may be wrong or not useful. So it's really a question of mental approach.
Aaron Coutts:
And some of the challenges you see are not so much in dealing with that approach; some of the threats we're seeing are in dealing with each other as scientists. We're not acknowledging these issues in our scientific work when communicating with other scientists. And that's where, I think, the major problem really lies.
Blake McLean:
We have a question coming through here from Jenny about how we work through that in the publication process: how do we better acknowledge biases and transparency through publication and peer review, and what are some of the issues that exist around that? So now we're at the other end of the spectrum. We've been talking about delivering evidence in a coaching setting; understanding it in the first instance, in the evidence itself, involves a complex web of issues to work through as well.
Franco Impellizzeri:
That's a problem, actually. Once we agree about the problems, we have to find a solution. And the review process is problematic, I know, because sometimes researchers try to hide their limitations; on one side they're afraid that if they expose the limitations too much, the paper can be rejected, for example. And this is fueled by reviewers who, when they see limitations, tend to reject.
Franco Impellizzeri:
In my opinion, of course, unless there are serious problems, and I can say that most of the time there are serious problems in papers, as long as the study has been approached in a proper way, acknowledging the limitations is the minimum. So the paragraph about the limitations should be in all papers.
Franco Impellizzeri:
But I have to say that sometimes I read the limitations sections of papers, and they are not really limitations, or they don't properly acknowledge the limitations. And you can see in the limitations section whether the researchers' background knowledge, in terms of methodology, is good or not. Take, for example, systematic reviews or meta-analyses. You can do everything properly, good analysis, good literature search, but maybe your conclusions are not balanced by the risk of bias assessment.
Franco Impellizzeri:
So they present the risk of bias, and you see, for example, that the risk of bias is high. Maybe they also show with the funnel plot that there is publication bias, but in the end the conclusion is still strong: "this strategy decreases injury risk," without considering that this statement is based on papers with a high risk of bias and publication bias, which means that, potentially, the strategy they're proposing may not be effective at all.
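To make this concrete, here is a minimal simulation sketch in Python. It is not from the webinar: the effect sizes, sample sizes, and the simple "only significant, favourable results get published" filter are all illustrative assumptions. It shows how a literature built on small, selectively published studies can make a truly useless strategy look effective.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
true_effect = 0.0          # the strategy truly does nothing
published = []

for _ in range(500):       # 500 small two-group trials
    n = rng.integers(10, 30)                 # small samples, common in sport science
    control = rng.normal(0.0, 1.0, n)
    treated = rng.normal(true_effect, 1.0, n)
    t, p = stats.ttest_ind(treated, control)
    d = (treated.mean() - control.mean()) / np.sqrt(
        (treated.var(ddof=1) + control.var(ddof=1)) / 2
    )
    # "File drawer": only significant, favourable results reach the literature
    if p < 0.05 and d > 0:
        published.append(d)

print(f"True effect: {true_effect}")
print(f"Pooled 'published' effect: {np.mean(published):.2f} "
      f"from {len(published)} of 500 trials")
# A naive reader of the published studies would conclude the strategy works.

A funnel plot of these "published" effects against their sample sizes would show exactly the asymmetry Franco describes; a strong conclusion drawn from them is the bias-blind summary he is warning about.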
Franco Impellizzeri:
So this is something that we should fix in the review process for sure. I don't know how to do that. But as I said, if all of us, as sport scientists and researchers, start to recognize that this is an important issue, since we are the reviewers of the papers, this can change the process a bit. We are part of the process.
Franco Impellizzeri:
I complain about the process, of course, but on the other side, I try to give my contribution to this process. So I don't like it when people complain but then, as reviewers, because as an editor I see their reviews, they commit the same mistakes they are complaining about. That is a bit disappointing. We are part of the solution.
Blake McLean:
What about the role, maybe you can speak to this, of the editors, associate editors and the journals in that? Something that I feel like I'm seeing, and I think other people feel like they're seeing, but it's not so much out in the open, is bias of journals in particular topic areas, and systematic [inaudible 00:36:39] happening at that level.
Aaron Coutts:
Yeah, this is a bit controversial, but we all see journals that are biased, because journals obviously get success from impact factor ratings, and obviously popularity and social media following. Some topics are sexier than others, and you can see there's bias in the selection of topics that are covered. Obviously, I can see there's a bias in doing that, and there's also a strategy in doing that. And I think that's a real risk, because it undermines belief in the process.
Aaron Coutts:
Everybody has conflicts of interest. Everybody has biases. The first stage, I suppose, is to acknowledge them, study them, and build in hardwired decision-making tools to stop them from creeping in, because by nature, sometimes you're unaware of them yourself. We've all had these problems ourselves. So I think that's the role of the board as well: an editorial board gives feedback to a journal and doesn't let individuals dominate the journal, for example. That's a real risk.
Aaron Coutts:
So I think they definitely exist. Because it's a popularity world, journals want increased impact factors, and that tends to get better papers submitted there. It's a real concern. So we have a role through the whole process, and we also have a role to identify these issues, as authors, reviewers, editors, and associate editors. Because no one's perfect, but what journals do have a responsibility to do is listen when they hear these issues being raised.
Blake McLean:
So, a little bit on the communication of some of this. We've just been talking about the medium of publications and papers; what about the role of social media in sport and in science? You have journals getting involved in tweeting and putting out opinions very quickly. So it's pervasive throughout, from [inaudible 00:38:36] all the way up to a journal in the scientific process. So where do you guys see the balance between getting information out quickly and sharing opinions, but also communicating with responsibility and robustness in ideas, thoughts and words?
Franco Impellizzeri:
Aaron?
Aaron Coutts:
Oh well, I thought Franco was going to take that one-
Blake McLean:
Franco has to give a response to this question at some point. It's compulsory, based on your social media profile.
Aaron Coutts:
... Yeah, I absolutely think so. I think Franco's obviously encouraged the preprint approach because of how that can improve the process. However, I really think you still need to go through the full peer review process, although preprints can help us improve our papers as they go through the research process. And the second part of your question, what was it, Blake? The preprint part was what I picked up on.
Blake McLean:
Just on social media: where does social media fit? It definitely has a role. It's helping share information, but there are some risks to it as well.
Aaron Coutts:
Yeah, there are absolutely risks, and you've really got to accept those risks, but it depends on what you're using social media for. Obviously my concern, as I see it, Franco, is perhaps the way you convey your messages. Sometimes intent can be lost on social media. But I know, for example, and I don't want to make this all about Franco's tweets, but they are a big part of our discussion at lunchtime, I know where they come from: Franco really cares about improving the research process.
Aaron Coutts:
But we also have the responsibility to get information out and educate the masses. Beforehand, you had to have privileged access to journals [inaudible 00:40:27] expensive, to get information. Where social media does have a real benefit is being able to get information out to the masses, and early. It's just how you choose to use social media; I think that's a personal philosophy. My personal approach is that it's not a self-promotion tool; it's to promote others, and also a way to get peer-reviewed, robust research out to the masses. What's your [inaudible 00:40:53] Franco?
Franco Impellizzeri:
No, I don't use social media a lot, so I cannot answer... No. No, joking apart, social media is of course a huge topic. And I think we need to handle social media, and we need social media in this phase of, let's say, sport science, but also in our society overall. So we need to handle this instrument; it's a powerful instrument. To be honest, I learn a lot, because in my network, for example, I have methodologists, statisticians, epidemiologists. I have access to the latest preprints, the latest papers. There are very good threads about some of the methodological issues. So I benefit a lot from social media.
Franco Impellizzeri:
Of course, social media is a very powerful instrument, and we need to learn to handle it. I think we are in a phase where we are still not very good at handling social media. And of course, sometimes I look like a troll, but I'm not a troll, I'm just delivering a message. I'll give you an example. The only reason I was a bit stronger on some topics is because I wanted to, let's say, "scare" people. But I did that because scaring, in that phase, was for me a way to slow down a bit the people jumping on the training load and injury train.
Franco Impellizzeri:
So if you slow down a bit and you think more, even if you want to argue against me, it's fine. This pushes you to think more about what you are doing, because when you want to argue with people, you should support your opinions and your arguments, and to support them, you are forced to think more. If I raise some methodological issues and you want to argue with me about them, you need to address my comments methodologically.
Franco Impellizzeri:
And this forces you into maybe reading more papers on that. Because the main problem we have in sport science, and I know this can sound like a rant, is that sometimes we jump on a topic, we read a couple of papers, and we think we have enough of a handle on that topic. And this is what I saw when I came back to the sport science world, for example, with questionnaires.
Franco Impellizzeri:
Single items in questionnaires have been addressed very poorly in sport science. If we exclude, of course, the area of sport psychology, where they are much more familiar with psychometrics and all these kinds of methodological approaches, sport scientists just jumped on this questionnaire topic without the right knowledge, committing very basic errors.
Franco Impellizzeri:
For example, measuring a multidimensional construct with one item. In any introductory course on psychometrics, they explain to you that this is not possible, or is very difficult. So it's not enough to read a couple of papers; this requires a lot of time. Coming back to the social media issue: social media are a great source of this information. If you want to address a new topic and you have experts in that topic in your network, you realize, following their discussions, how complex most of the topics we think are easy really are.
Franco Impellizzeri:
So social media are very powerful. I think they are very useful, but I also think they are our main threat at this moment. We need to be careful, also, when we write the abstract. We know that the abstract is likely to go on social media, and people usually don't read the paper, they just read the abstract. Sometimes they just read the title. So with the conclusion of the [inaudible 00:45:38], in my opinion, there should be more balance between the strength of the methodology used and the conclusions.
Blake McLean:
I mean, a recurrent theme... I've thrown this slide up here that you've put together, Franco. The recurrent theme that I'm hearing is that it's more about the process of thinking; the answer to these questions is not just consuming information, which I feel is quite pervasive in the way we think about traditional education and models: I go and learn about a specific topic area, and I become an expert in that area. The theme coming across to me from what you're saying is that it's more about the process of thinking and processing that information.
Franco Impellizzeri:
Yeah.
Aaron Coutts:
Most of us are trained with PhDs, right? And that's where we learn that critical thinking ability. That's what we've learnt through discovery in our PhDs, and that's the kind of thing we're trying to put into our new master's program as well. It's more about training your mind how to think, rather than just consuming facts. Obviously, we try to do it in the context of sport. But absolutely, that's the most powerful thing we gain throughout our higher education and our postgraduate degree training, I think, Blake.
Franco Impellizzeri:
Yeah, because explaining the research methods, the tools, is relatively easy; you can find them everywhere. So that's not the problem. The problem is to understand the whole process before using a specific tool. So it's more conceptual. And the reason we've discussed this a lot in recent months is because I'm struggling a bit in developing the content for our new master's. The reason I insisted on adding philosophy of science, critical thinking, and logic is exactly this: they provide the background knowledge to be able to analyze a problem and develop a proper research project, or to interpret the results more conceptually.
Franco Impellizzeri:
Even in this chat we've discussed evidence, but what is evidence? Let's say it's an observation, an effect, or whatever, that can increase our confidence in a specific theory. It's easy for us to say, "Yeah, there's evidence. Yeah, there's no evidence." But there is a theory of confirmation, where philosophers and scientists spent a century developing ways to understand when something is, or is not, evidence. And we don't know this area of these disciplines.
Franco Impellizzeri:
It's important that we know that even a small term like [inaudible 00:48:56], a small term like evidence, carries a lot of, for example, epistemological load, and there are theories to suggest when something is, or is not, evidence. And that's important. And it's our role, since we are academics, to think about how to teach future researchers to approach problems more conceptually, knowing a bit more about the philosophy of science.
Aaron Coutts:
That, and also applying a good understanding of research design. Because what I've seen, particularly in the fields I work in, is that a lot of the work we've done has been post hoc development of questions from available data, collected with convenient technology or tools available in the field, and then mostly descriptive studies being used to make overstated practical recommendations.
Aaron Coutts:
And I've made all those mistakes myself, but that's where I see some of the threats. Particularly in the last decade, we've had these types of issues occurring. We've got to go back to the foundations, and I think that fits back into our research methods training. So I want a shift in paradigm, I suppose, towards more focus on research methods, for early career researchers in particular.
Franco Impellizzeri:
Yeah, because what I see, honestly, is that we are skipping a step in the research process. Descriptive studies are very useful, of course, but they are useful because you can develop theories, new ideas, new hypotheses. And this is important in science. But once you have these descriptive results, you have to develop the theory, or a framework, or whatever, and then you have to test that framework or theory.
Franco Impellizzeri:
I'm aware that sometimes it's difficult to test the theory that you [inaudible 00:51:00] after using these descriptive studies. But it's important, again, that we acknowledge and recognize that if we are proposing something that has not been tested, or the predictions of this new theory have not been tested, it's just a speculation.
Franco Impellizzeri:
As soon as we are aware it is a speculation, it's fine even to develop practical applications. You can say, "Based on these results, we have this theory. And based on this theory, we propose this way to structure a training program." But most of these steps are just speculations; it's not evidence-based. To be evidence-based, these theories should be tested; they should be exposed to falsification in some way. And I don't see this step. Most of the recommendations that are published are based on descriptive studies.
Franco Impellizzeri:
If we talk about injuries, for example, 95% of the information is based on descriptive studies. Which is fine, as long as we acknowledge and say that the recommendations have been developed by speculating around those descriptive studies, and they're not sold as evidence-based. You know, I get mad when I see people saying, "This is evidence-based," because it's not evidence-based, it's a speculation. It's not evidence. The evidence, at best, can support some links in your, let's say, framework. But most of the time these links have not been tested; they are really speculation. So this transforms us into something closer to a guru than a scientist.
Aaron Coutts:
I think the answer, Franco, comes in our training of early career researchers, retraining ourselves, of course, and a commitment to continually improving. We all need to understand that research takes a long time to do. There is the threat of publish or perish: among researchers, particularly early in their careers, you see people wanting to publish a lot of papers rather than good papers. We can reflect now and say, "Well, it's the good papers that last. The cheap and easy ones don't last, and they actually probably do more damage than good."
Aaron Coutts:
And so we need to re-shift our systems to focus on quality, rather than quantity. Another approach we've taken up, and we've discussed this, is the red team approach that I think Daniel Lakens proposed. So with any proposal we have, we try to take an antithesis approach, with someone countering the research design or countering the approach. I think that's very valuable, particularly in the design phase of developing projects.
Blake McLean:
A little bit of a shift into some of the outcomes from the research that we usually do in sport. A good question coming in from Tannith here: a lot of the stuff we do is cohort-specific, and we struggle to get large numbers and big population-style data. So how do we take results from a lot of the work that we typically do? Do they just apply in our own setting, or can some of them be generalizable? Obviously there's a balance, but what are your views on that?
Franco Impellizzeri:
That's definitely a problem for all studies, actually. Even in a well conducted randomized trial, understanding the extent to which you can generalize the results is a big issue in all fields, including medicine. I come back again to the same concept: as soon as you have results in a specific cohort and you want to apply them to your players, you can do that, but you should know that extending the results doesn't ensure that you will obtain the same effect in your cohort.
Franco Impellizzeri:
So if I see something, I think, "This is interesting, there is some evidence in that population, and I want to apply it to my population." And this is where critical thinking also has a role. I should try to understand which characteristics of my population can support, or can reduce, the extension of the findings from the original population, knowing that every time I extend the results to a different population, it may be, let's say, wrong, or I may not find the same effect in my population. So there's no, let's say, scientifically valid way to extend the findings to another population, because in theory, you can't do that.
Aaron Coutts:
I think, Franco, a lot of work's done in individual teams, and looking at Tannith's question, I think some people perceive a competitive advantage in some of these studies. But I actually disagree. There's no competitive advantage if the studies are underpowered and you don't have the appropriate design. So I think we need to take a long, slow approach to answering some of these questions, and make the effort to take the multi-[inaudible 00:56:32] approach. It's not common in sport science, and it's something we need to work towards. And it is difficult, but the issue then is getting funding to do it. So that requires a shift in our mindset around collaboration, and obviously a shift in funding bodies supporting these types of studies.
Blake McLean:
I think one of the competitive advantages there is the systems and the people, mainly the people, but a little bit the systems as well, on the ground within the sporting organizations. And taking that critical thinking, and something we talk about a lot here at UTS, is: how do we translate that into practice? And that's not a simple process of, someone did this in a study and I go and run that method. It's contextually specific, it's different. It might even be different within the same sport for a different team, or for the same team from year to year; that changes how you apply it.
Blake McLean:
But if we have systems which give you a combination of objective and subjective feedback, and really strong communication throughout the organization, and we then work out how that applies, not only to your group of athletes, but to individual athletes within that group. To me, that's a competitive advantage, not just the method or the learning from a study. It's the people applying it.
Aaron Coutts:
That's the difference between the sport science researcher and the applied sport scientist; we need to understand that they're very different. They complement each other, but they're very different roles. So I agree completely, that philosophy helps the organization, but the conclusions from those studies, I suppose, aren't as strong as we tend to internally promote, usually.
Blake McLean:
Yeah, and I say the systems because it's almost a little bit of that red team approach that you talk about, Aaron. If you can set up systems to challenge your bias and your belief in that moment, it doesn't mean that you're not going to do that thing; you might just stop a little bit and think, "Okay, how can I evolve this, and is this the best program right now?"
Aaron Coutts:
This is a complex issue, right? [inaudible 00:58:33] getting belief in a program, because maybe some people don't like getting feedback that doesn't confirm their beliefs in what they're doing. And that's a challenge when you're working in that environment. Some practitioners and coaches don't like to get feedback that maybe this isn't the best approach, because they've got to get their athletes and support staff to believe in the approach. So that's a challenge.
Franco Impellizzeri:
Can I answer a question?
Blake McLean:
Sure.
Franco Impellizzeri:
There is a question from Lauren about the quality of papers coming out, which I agree is sometimes a bit frustrating. What I think is that the review process is just the first step, just the first filter. It is not a definitive filter. So what we need to improve is the ability of people to read and interpret those papers. And this is where, in my opinion, the role of the sport scientist is important. We cannot expect a coach to have all the skills and abilities to understand the risk of bias in papers. But nowadays in professional sports there are plenty of sport scientists. So we have to understand a bit what the role of the sport scientist is, and one of the roles of the sport scientist is to be able to understand the strength of the papers that are published.
Franco Impellizzeri:
So I agree that it's difficult, and sometimes, maybe more than sometimes, there are papers that are weak. But in theory, this shouldn't be a concern, because when I read a paper, I should have the ability to understand whether it's weak or not. And when I communicate the results, or, let's say, prepare a report for the members of the team, I should be able to understand whether a paper is worth mentioning or not. So this raises a question about education, about whether sport scientists are educated enough to be able to critically analyze these studies. But of course, that's another issue.
Franco Impellizzeri:
So I think the review process is not perfect, but we shouldn't leave everything up to the review process; it's just a minimum quality check, and the real quality check should happen after. This is the reason papers are sometimes retracted: because there is a scientific community reading the paper, understanding the pitfalls and the mistakes, and [inaudible 01:01:24] these mistakes, and sometimes they are so severe that the papers are retracted.
Franco Impellizzeri:
I think where we need to improve for sure, in sport science and sport medicine, is in accepting this criticism, because most of the time there is a tendency to hide the problems of the studies. And this is something that we, as scientists, should address in the future: not hiding the problems. There's nothing bad in admitting that one made a mistake. So I don't see why we take this so badly in sport science compared with other fields.
Blake McLean:
It's a difficult mindset for high performing people in all fields, I feel, be it athletes, coaches, or scientists. It's very difficult in the sports environment. It's almost like part of the culture is that you don't make those mistakes, so admitting to them and readjusting is just difficult within the culture, I think.
Franco Impellizzeri:
Yeah, and just because Lauren has [inaudible 01:02:36] the question: yes, sometimes as scientists we have pressure to publish as soon as possible, and I agree, this is a source of potential errors. This is a bit problematic, because we need more balance between fast publication, the amount of publications, and their quality. I think-
Aaron Coutts:
[crosstalk 01:03:00] the resources it brings, early in a career in particular. Early in their career, publications bring the resources for scholarships and for jobs. We've got to get our focus back on quality, and our system needs a shift. I don't think we're going to do that today, but I think we all agree that we need to shift that in the academic system.
Franco Impellizzeri:
... I would say that [inaudible 01:03:23]. The supervisors have a responsibility. When I see studies that are weak because of very basic mistakes, there is a PhD student involved, and when I complain about these big or basic mistakes, I'm a bit uncomfortable, because in reality, in my opinion, the responsibility is not the first author's, which is the PhD student, someone who is there because they are learning to do research; it's the supervisor's.
Franco Impellizzeri:
So what makes me a bit uncomfortable is why supervisors supervise PhD students in topics that they don't have a handle on. That's the point. I understand that you have to follow the wishes of the student; maybe the student would like to address a specific topic where you are not an expert. But you have a responsibility, and if you are not an expert, you should involve an expert in your research team.
Franco Impellizzeri:
But when I see papers addressing topics that, for some reason, I know, because, for example, I was addressing them in a clinical setting, and I don't see people on the team who are clinical experts, I think this is risky. Not necessarily that the paper is bad, because maybe they've read a lot of papers, a lot of methodological papers, and they know how to address it.
Franco Impellizzeri:
But this is a source of potential errors. So in order to be fast and reduce the mistakes, I think you need to have someone on your team who is an expert, or you need to spend time as a supervisor reading much more in depth the studies, the methodological papers, and sometimes textbooks on that topic. Because I see errors that, had you read a couple of key papers, you wouldn't make, and this means that those basic or seminal papers have not been considered at all.
Aaron Coutts:
[inaudible 01:05:34] questions, haven't we? But I think our time's almost up.
Blake McLean:
Yeah, we're running a little bit over time here. Super interesting discussion. Thank you, Aaron and Franco, for making time for us this morning. Everyone who's jumped on to listen in, thank you very much, and we hope to keep the conversation going with everyone and continue to evolve, both, I think, from a scientific and academic standpoint, but also in how that filters down into application at the coalface of high performance as well. So thanks very much, guys, for your time.
Aaron Coutts:
No worries.
Franco Impellizzeri:
[crosstalk 01:06:08]
Aaron Coutts:
Thanks for listening everybody. Hopefully it was a useful event.
Blake McLean:
Cheers guys.
Franco Impellizzeri:
[inaudible 01:06:15].
About the webinar
Hear from world-leading researchers about their experiences and learnings from working in high performance sport. They discuss common biases and fallacies encountered in sport science, and share their own experiences with these issues. In this webinar, Coutts and Impellizzeri explain why we should learn to embrace uncertainty and synthesize information to support informed decision-making in high performance sport settings.
As world-renowned experts in the field of human performance, Distinguished Professor Coutts and Professor Impellizzeri have extensive experience in producing impactful and evidence-based research that improves human performance.
Distinguished Professor Aaron Coutts
Aaron Coutts is a distinguished professor in sport and exercise science and Director of the UTS Human Performance Research Centre. Aaron teaches in the areas of applied exercise physiology, exercise prescription, and research design and statistics. His research focuses on developing evidence-based methods for improving the performance and health of athletes. Aaron also provides sport science advice to several top-level professional football clubs and sporting organisations.
Professor Franco Impellizzeri
Franco is a professor in sport and exercise science and medicine, with extensive experience in applied sport science (both as a researcher and practitioner) and orthopaedics (clinical outcome studies and clinimetrics). His research interests include research methodology and various aspects of exercise science (training and testing) for both high performance and health-oriented outcomes. Franco has trained top level Olympic athletes in different disciplines (winter and summer sports) and collaborated with international sport organisations.