Thoughts??
I learned how to do a fucking LOT of statistical shit in my degree. I also learned to get REALLY good at all kinds of shit in Excel.
Guess which one helped my career the most in an actual, practical way? Guess which one made people seek me out at work for help with things?
Sometimes Excel is what's available. Sometimes it's just faster to do it that way than to code up some ridiculously overdone solution in some programming language. Having both skills is best, but don't shit on opening an Excel sheet and just fucking getting it done, whatever it is.
Used right, it can also be a great equalizer with less technically skilled coworkers. You can quickly format and tune things, and even layer in a little VBA to make their lives easier, without getting into the complexity of an entire bespoke coded solution.
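The commenter means VBA inside the workbook itself; as a hedged stand-in for that "light automation" idea, here is a minimal sketch in Python with openpyxl (the thread's other language). The file name, column layout, and threshold are all invented for illustration:

```python
# Hypothetical example: flag out-of-range readings in a results sheet
# so colleagues see problems at a glance. File name, sheet layout, and
# threshold are made up for this sketch.
from openpyxl import load_workbook

THRESHOLD = 100.0

wb = load_workbook("lab_results.xlsx")  # hypothetical workbook
ws = wb.active

# Assume column A holds a numeric reading, column B is free for a flag,
# and row 1 is a header row.
for value_cell, flag_cell in ws.iter_rows(min_row=2, max_col=2):
    if isinstance(value_cell.value, (int, float)) and value_cell.value > THRESHOLD:
        flag_cell.value = "CHECK"

wb.save("lab_results_flagged.xlsx")
```

The point stands either way: a dozen lines of glue, whether VBA or Python, beats commissioning a bespoke app for this kind of chore.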
Also, a reminder for those in the back: for most of us, college isn't about learning a specific skill so much as learning how to be taught, about proving we're capable of taking instructions and producing results as requested.
If you never understand this, you'll never understand later why you fail to land a high-quality job.
"Sometimes Excel is what's available."
I worked for a Big Company that was cutting back and dropped their Oracle contracts, forcing all the DBAs to work in Access. Then they fired all the DBAs, forcing everyone to either try to figure out Access or switch to Excel. Guess which way they went.
In my last job at that company, my department had built an Excel spreadsheet (really a database) so large and so full of calculations that they had to request money to upgrade our machines to 64-bit Windows with 16GB of RAM just to run it; 32-bit Excel tops out at a couple of gigabytes of addressable memory.
I really like this idea, but I'd make one small change: I think it's best to learn how to learn.
Learning how to be taught is part of that, and a large part. Understanding when to absorb information, rely on experts, and apply yourself until you improve is fundamental. You won't get any arguments from me there.
But being taught is only one facet of learning. Sometimes experts aren't really experts, or don't have the learner's best interests at heart, or omit things to protect their own interests or ideology.
Learning how to learn involves fostering fundamental curiosity, not being afraid to fail, and asking all the questions, even dumb ones or ones with seemingly obvious answers. It means finding out why something works instead of just how.

Fundamentally curious people who learn as a habit also tend to develop a scientific-method-like approach to evaluating incoming information: "OK, here's the claim I'm presented with. Take its opposite as the null hypothesis: can the evidence actually reject it?" This acts as a pretty good bullshit detector, or at the very least trains learners to be skeptical, to trust but verify, which is enormously important in the age of misinformation.
Being taught generally tapers off as someone gets older, or becomes an expert. Learning never needs to taper off, so long as your brain still works.
https://www.visidata.org/
Blows Excel out of the water, and it's not even close. And it's free, open source, and completely extensible (with Python, not some godforsaken excuse for a programming language).
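"Extensible with Python" is literal: VisiData reads a ~/.visidatarc file, which is plain Python, at startup. A minimal sketch using the Sheet.addCommand(keystrokes, longname, execstr, helpstr) hook from VisiData's plugin docs; the keybinding and the command name below are invented for the example, and the exact API may vary by version:

```python
# ~/.visidatarc -- plain Python, loaded by VisiData at startup.
# Binds "2" to a custom command; both the key and the 'go-down-ten'
# name are made up for this sketch.
from visidata import Sheet

Sheet.addCommand('2', 'go-down-ten', 'cursorDown(10)', 'move cursor down ten rows')
```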
You let me know when the vast majority of workplaces have even HEARD of this, much less adopted/allowed it.
Learning to learn is what the twelve years of babysitting we all go through is supposed to be doing. The fact you overlook that is why more than half of US adults read below a sixth-grade level. Post-secondary education is 100% about learning advanced skills and developing the techniques needed for a career. Saying otherwise is why companies are looking for doctoral degrees for entry-level positions, and they can all burn in hell.