His comment didn’t address two key issues for me:
- The “crunch”/tight scheduling of projects which led to sloppiness to begin with
- The constant need for corrections, ranging from simple mistakes to seriously flawed methodology.
The WAN Show is the only content of theirs I’ve really been enjoying, but hearing about constant mistakes in benchmarks while they boast “We want to show factual information on benchmarks for once” rubs me the wrong way. You can’t rush benchmarking without QA and publish those results as fact. You have to choose: accuracy, or churning out content fast.
And Linus not saying anything concrete about the first issue worries me; it doesn’t show a clear intent to ease off on rushing those benchmarks.
Not to mention, it’s worth taking down a video if the benchmarks are wrong, even if the conclusion is “most likely to remain the same”, which one cannot conclude with certainty without redoing the testing. Transparency-wise, it would be better to either not knowingly publish wrong information, or to put a clearer notice on those videos beyond the description and a pinned comment.
Does anyone have any suitable “alternatives” in podcasts that are similar to the WAN Show in atmosphere? (As in: longer conversations about tech-adjacent topics.) After these discoveries, I don’t feel comfortable going near LMG’s content anymore (to be frank, the WAN Show was the only thing I watched consistently).