One of my Core Memories when it comes to science, science education, and education in general was in my high school physics class, where we had to do an experiment to determine the gravitational acceleration of Earth. This was done via the following mechanism: Roll a ball off of a standard classroom table. Use a 1990s wristwatch's stopwatch mechanism to start the clock when the ball rolls off the table. Stop the stopwatch when the ball hits the floor.
Anyone who has ever had a wristwatch of similar tech should know how hard it is to get anything like precision out of those things. It's a millimeter-sized button with a millimeter depth of press, and it could easily take half a second of jabbing at it to get it to trigger. It's for measuring your mile times in minutes, not fractions-of-a-second fall times.
Naturally, our data was total, utter crap. Any sensible analysis would have produced error bars that, treating the problem linearly, put zero and negative numbers within range. I dutifully crunched the numbers, determined that the gravitational acceleration was something like 6.8 m/s^2, and turned it in.
Naturally, I got a failing grade, because that's not particularly close, and no matter how many times you are solemnly assured otherwise, you are never graded on whether you did your best and honestly report what you observe. From grade school on, you are graded on whether or not the grading authority likes the results you got. You might hope that there comes some point in your career where that stops being the case, but as near as I can tell, it literally never does. Right on up to professorships, this is how science really works.
The lesson is taught early and often. It often sort of baffles me when other people are baffled at how often this happens in science, because it more-or-less always happens. Science proceeds despite this, not because of it.
(But jerf, my teacher... Yes, you had a wonderful teacher who not only gave you an A for the equivalent but also called you out in class for your honesty and, I dunno, flunked everyone who claimed they got the supposed "correct" answer to three significant digits because that was impossible. There are a few shining lights in the field and I would never dream of denying that. Now tell me how that idealism worked for you going forward the next several years.)
For those who are actually interested in this field, the proper way to measure this would be with a four-point probe. You do need a constant current source and a high-impedance voltage meter, though.
Also, you don't need to solder wires to the sample. But if you want to measure the Hall resistance of a thin film of a semiconductor, you can solder a glob of indium onto each of the four corners of a 1 cm x 1 cm wafer, put it in a magnetic field, and then do basically the same measurement as a four-point probe, except not inline.
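For the curious: the four-corner-contact geometry described above is what the van der Pauw method analyzes, and the arithmetic to turn the measured resistances into a sheet resistance is a one-liner plus a root solve. A rough sketch (function name and the bisection approach are my own; it assumes ideal point contacts on a uniform film):

```python
import math

def sheet_resistance(r_a, r_b):
    """Solve the van der Pauw relation
        exp(-pi * r_a / r_s) + exp(-pi * r_b / r_s) = 1
    for the sheet resistance r_s by bisection.
    r_a, r_b: four-terminal resistances (ohms) measured across
    the two adjacent pairs of corner contacts."""
    f = lambda r_s: (math.exp(-math.pi * r_a / r_s)
                     + math.exp(-math.pi * r_b / r_s) - 1.0)
    # f rises monotonically from -1 (r_s -> 0) to +1 (r_s -> infinity),
    # so plain bisection on a wide bracket converges reliably.
    lo, hi = 1e-9 * max(r_a, r_b), 1e9 * max(r_a, r_b)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# For a symmetric sample (r_a == r_b == R) this reduces to
# r_s = pi * R / ln(2), handy as a sanity check.
print(sheet_resistance(1.0, 1.0))  # ~4.532 ohms per square
```

The same two resistance measurements, repeated with the field on and current routed diagonally, give you the Hall voltage part of the comment above.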
I read this in 1999 when entering university. It was so refreshing hearing a student provide a glimpse into the boots-on-the-ground reality of undergrad life at these world-renowned institutions.
The closing sentence is also prescient; the author pivoted to CS, ultimately completing his doctorate at the University of Wisconsin at Madison.
LinkedIn has him as Staff Software Engineer at Google: https://www.linkedin.com/in/lucas-kovar-185a3531/
I'm pretty sure he's rolling in cash now :-)
But still doesn't have any women ;)
Not so fast... You might be surprised.
Someone send him a link to this HN post and invite him to join us!
I read it about the same time. My friends and I (all of whom declared Physics and most of us switched to other majors before graduating) had tears in our eyes reading it. Funniest thing I had ever read.
I'm glad he's doing well.
I TAd a semiconductor fabrication lab class 20-odd years ago. Mostly it was about making sure the students had the absolute fear of God put into them about working with HF, but there was also a bit at the end where you actually got to do a voltage sweep and characterize your transistor. If in fact you had made a transistor rather than a needlessly complicated resistor. The other TAs and I passed this paper around and thought it was just hilarious.
And then there are Etsy moms making frosted shot glasses
With HF?! Got a link to this madness?
I would make them reread the MSDS.
I've seen an MSDS for sodium chloride USP that recommends against use in food, and says that you should wash your skin with abundant water for 15 minutes if you contact it and seek immediate medical attention if it gets in your eyes (after, of course, spending 15 minutes in the eyewash station). It also warns you to keep it away from sources of ignition, that it should not be released into the environment, and that you should not handle it without gloves and face protection.
Here, this is the first sodium chloride MSDS I googled up: https://www.fishersci.com/store/msds?partNumber=S64010&produ...
Fortunately, HF is as safe as NaCl. Or so I take you to suggest, this otherwise totally failing to follow from anything anyone has said...
> (...) the apparent legitimacy is enhanced by the fact that I used a complicated computer program to make the fit. I understand this is the same process by which the top quark was discovered.
This is both hilarious and more common than you might think. In my field of expertise (ultrafast condensed matter physics), lots of noisy garbage was rationalized through "curve-fitting", without presenting the (I assume horrifyingly skewed) residuals, or any other goodness-of-fit test.
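For what it's worth, the cheapest goodness-of-fit check is a single number. A toy sketch of the point (pure NumPy, a straight line standing in for the exponential, and deterministic alternating ±1 "noise" so the result is reproducible; everything here is illustrative, not anyone's actual pipeline):

```python
import numpy as np

def r_squared(y, y_fit):
    """Coefficient of determination: fraction of variance the fit explains."""
    ss_res = np.sum((y - y_fit) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

# "Data" that is pseudo-noise with no real trend.
x = np.arange(10, dtype=float)
y = np.array([1.0, -1.0] * 5)

# A least-squares fit will happily draw a line through it...
slope, intercept = np.polyfit(x, y, 1)
y_fit = slope * x + intercept

# ...but the goodness-of-fit number gives the game away.
r2 = r_squared(y, y_fit)
print(f"slope={slope:.3f}, R^2={r2:.3f}")  # R^2 near 0: the "fit" explains almost nothing
```

Plotting the residuals (y - y_fit) would make the same point visually: for a genuine fit they look like structureless scatter around zero; for a curve drawn through noise, the "residuals" are essentially the whole signal.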
I took some effort to change my research interest from computer vision to DFT calculation in quantum chemistry.
Honestly, I'm kind of frustrated now; too much work in this area is closed-source. The research paper will tell you everything except how to reproduce the work with minimal effort; it's like they are hiding something.
They also use Origin to plot and MS Word to write papers, both of which are non-free licensed, which makes their work harder to collaborate on and reproduce.
> The research paper will tell you everything except how to reproduce the work with minimal effort; it's like they are hiding something.
They are. I used to work in an adjacent field. Everyone was open about doing it - they're competing with others for grants, and worry that if they reveal the secret sauce, others will move faster than they can.
You can say you performed a DFT calculation to get the result, but anyone who's studied these types of simulations/calculations knows that it's highly nontrivial to implement, with lots of coding and numerical tricks involved. So it's extremely hard to reproduce if you don't have detailed access to the algorithms.
Very true that they're hiding things. I actually wrote some code (that strung together other people's code) to complete a simulation pipeline for nonadiabatic molecular dynamics. I was tasked with writing documentation to teach the group but was instructed not to release it anywhere publicly, because other groups would simply take the method and move faster since they had more money and compute.
This issue also bugged me for a while. It is more of a cultural issue, and the older the research group is, the less likely its research software is to be open, in my experience.
In the area of deep-learning-based simulations, one good example of open software is NetKet. The researchers there are pretty active in the GitHub/GitLab/Hugging Face ecosystem.
I miss the OriginPro of my undergrad days, when we had campus licenses, before moving to matplotlib for data visualization. matplotlib is simply too disappointing for making publication-quality figures. The most recently encountered problem is how to plot with a broken x-axis, which is one of the most basic needs in physical science but requires a non-trivial amount of hacking to get with matplotlib.
Open source tool or not, I don't care at all as long as I get the science right. I already have enough frustration dealing with my samples, so I simply want the least frustration from the software I use to plot.
Matplotlib is a bit painful. Often seaborn will work quicker, especially when using Pandas dataframes with proper column names and seaborn compatible layout.
It's annoying that you cannot create a broken axis out of the box, but I am sure you can wrap this into your own convenience function: https://matplotlib.org/stable/gallery/subplots_axes_and_figu...
That link was what I referred to after Googling, but in my case I need the width of the left part and the right part to be different, which requires setting width_ratios in the subplots and adjusting the slope of the hacky lines used to draw the broken axis symbol. seaborn also would not help in this exact case.
There is a package by some nice guy, https://github.com/bendichter/brokenaxes, just to do the broken axis. But not being built into Anaconda is already an annoyance, and in my case it generates a figure with an ugly x-label.
I ended up letting ChatGPT generate the code for me with the two required hacks. I simply need the figure in the minimal amount of time and with the least mental bandwidth, so I can focus on the science and catch the conference deadline. Origin is a very "over-engineered" piece of software, but hey, getting a broken axis is so simple (https://www.originlab.com/doc/Origin-Help/AxesRef-Breaks ). Sometimes the "over-engineering" is necessary to minimize users' pain.
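For anyone landing here with the same problem: the two hacks being discussed (unequal panel widths via width_ratios, plus the slanted break marks) fit in a short script. A sketch following the matplotlib gallery's broken-axis recipe, with width_ratios added; the specific limits and ratios are placeholders:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs without a display
import matplotlib.pyplot as plt
import numpy as np

# Two subplots sharing a y-axis; unequal widths give the unequal segments.
fig, (ax_l, ax_r) = plt.subplots(
    1, 2, sharey=True,
    gridspec_kw={"width_ratios": [3, 1], "wspace": 0.05})

x = np.linspace(0, 10, 200)
y = np.sin(x)
for ax in (ax_l, ax_r):
    ax.plot(x, y)
ax_l.set_xlim(0, 6)    # wide left segment
ax_r.set_xlim(9, 10)   # narrow right segment

# Hide the facing spines so the two panels read as one broken axis.
ax_l.spines["right"].set_visible(False)
ax_r.spines["left"].set_visible(False)
ax_r.tick_params(left=False)

# Slanted break marks, drawn as line markers in axes coordinates
# (the approach used in the matplotlib gallery example).
d = 0.5  # controls the slope of the break marks
kw = dict(marker=[(-1, -d), (1, d)], markersize=12, linestyle="none",
          color="k", mec="k", mew=1, clip_on=False)
ax_l.plot([1, 1], [0, 1], transform=ax_l.transAxes, **kw)
ax_r.plot([0, 0], [0, 1], transform=ax_r.transAxes, **kw)

fig.savefig("broken_axis.png")
```

Adjusting d compensates for the aspect-ratio distortion mentioned above when the panel widths differ.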
Honestly, if you're doing scientific work there is no reason not to output the data somewhere and plot in R with the standard lib (insanely good for science style plotting but hard to use) or ggplot (what matplotlib wished it was)
Honestly, when it comes to hacking things together with matplotlib I outsource all of my thinking to chatgpt to do the 80% of doc hunting that is honestly not worth it since everything in matplotlib is labelled inconsistently.
It takes a special kind of mind to appreciate this short post, not as fiction, but as truth and also as a jab at the physical sciences in general.
More jabs available at https://pages.cs.wisc.edu/~kovar/bio.html and https://pages.cs.wisc.edu/~kovar in general.
Why is it a jab at physics? It's honest and beautiful -- I imagine this is exactly what an experience on the cutting edge of experiment is like! :D
Making this measurement (an ancient discovery) with latest equipment is easy, but imagine what it might have been like for the people who actually discovered this property of germanium. Our tools/probes cannot advance much faster than our understanding of a (related) subject -- we are constantly inventing/improvising tools using cutting edge scientific knowledge from a related field.
I mean if you didn't already know how to solder to Germanium crystals you would have had to spend months experimenting with the material before you could get leads to stick.
Google said (AI result):
Soldering a lead to a germanium crystal typically involves using a gold-germanium solder alloy (like 88% gold, 12% germanium) due to its compatibility and good bonding properties.
Also one of the search results implied etching first could help remove germanium oxide and used a different solder: https://www.researchgate.net/post/How-to-solder-germanium-wa...
Plus you'd need to decide how to get a good thermal connection to set the temperature of the crystal - maybe via one big lead?
Being in the future makes some things simpler?
The little experience I've had with lab physicists showed they needed a good ability to build, debug and maintain their own equipment. You can't always rely on technicians.
In all but the very richest physics research groups, there is no such thing as technicians. The exception is shared equipment in centrally managed facilities like the nanofabs, and even there you need to tune your own recipe...
Especially when the entire concept might seem absolutely absurd at the time.
[dead]
I'm an industrial physicist, and the post put a smile on my face. And indeed, it's not fiction. It's a blast. You will go through times like this, I guarantee it.
I've been wrestling with a cantankerous experiment for a couple of weeks. It produces reproducible results, but they don't make sense, and the work is not in a domain where discovering new physics by accident is likely.
I understood and appreciated it, and I’m not special
I appreciate it just from reading enough HN and XKCD
(2000)
(at most: https://web.archive.org/web/20001031193257/http://www.cs.wis...)
https://pages.cs.wisc.edu/~kovar/cv.html
Looks like he went on get a PhD in CS and is now a staff SWE at Google, according to his LinkedIn. Guess he's rolling in cash after all.
You're right, I looked up and he seems to work at Google as a SWE: https://www.linkedin.com/in/lucas-kovar-185a3531
Happy he made the leap and at least gets paid well now (I hope).
The last line seems strangely prescient:
> I still wouldn't have any women, but at least I'd be rolling in cash.
Did they get a girlfriend?
Fear not, he's being paid well.
That this is the chosen path says a lot about how we as a society allocate money and value things.
> (2000)
It was probably actually written sometime prior to June 1999, because that's when the author got his Physics BS at Stanford (https://pages.cs.wisc.edu/~kovar/cv.html).
I kinda want to know more of the backstory around this. What grade did he get? Or was this a private venting exercise he later put up on his webpage, once he was well clear of the course?
The author did eventually go into CS, I wonder if this project was his actual breaking point.
Yeah I want to say I remember this making the rounds (remember email forwards?) during my first year of undergrad ('99-'00) but I'm a bit fuzzy on the exact timing.
Not that it is important, but I just spotted that the page's HTTP headers report an impressive
Last-Modified: Sun, 26 May 2002 22:33:04 GMT
(And the HTML code structure matches that era perfectly.)

While the post is amusing, somebody needs to say it: band structure and the theory of solids is some of the most beautiful physics out there. The fact that it has completely altered society as we know it is merely secondary. :)
Very much my undergrad lab education experience...
I'm currently writing my master's thesis in experimental quantum computing - the platform is similar to what Google published in December, just with fewer qubits. A lot of it just comes down to how much money the lab can spend to get the best equipment and how good your fabrication is.
You can have the best minds in experimental physics, but without the right equipment the grad students are just busy trying to make things work somehow and waste months if not years away.
I spent a good minute looking at the exponential in the graph, ignoring all the actual data points, thinking to myself that the experiment does show an exponential relation. Where's the lie?
Guess that's the power pictures have over words.
> ignoring all the actual data points
Well that's your problem.
The line is the predicted, not actual. How would you derive that line from plot of noise?
>> I drew an exponential through my noise.
The issue is that there was supposed to be a curve according to his reading, but the actual data had no measurable trend. It's possible that the data was measured on the wrong scale. If you zoom out, those noise plots become a line segment. Then again, the predicted line is on the same scale (and we're assuming that it's correct according to his reading, or the best he could fit), so zooming out would probably be a different form of lying with statistics via overfitting.
There should be some more examples of how to lie with statistics?
Believe it or not, there's an entire book about it!
Yes, it's called "How to lie with statistics" :)
I believe this is commonly known as marketing.
[dead]
The main lesson I was taught by undergrad chemistry and statistics is that the point of science is to lie and massage your terrible incorrect results until they look realistic, claim any remaining error was due to the shitty imprecise equipment you're saddled with, and turn it in, because you don't have enough time or money to try again.
Oh, BTW, the whole "Friction is directly proportional to the normal force": My Ass!
I could never reproduce it well in the lab, because it's really not true. Take a heavy cube the shape of a book. Orient it so that the spine is on the floor. There's a lot more friction moving it in one direction than in the transverse direction. Yet the normal force is the same. Any kid knows this, and I feel dumb it never occurred to me till someone pointed it out to me.
Friction is proportional to the normal force, more specifically, it is the normal force times the coefficient of friction.
What you are describing (if the normal force is actually the same) is a contact situation where the coefficient of friction is different in different directions (anisotropic friction.)
Not true in practice for a lot of materials
The “proportionality constant” is doing a lot of work in that claim. A lot of “constant” parameters are swept under the rug. If you fix enough stuff that claim is indeed correct, although I agree a bit simplistic
Is this possibly because you need to use additional force to horizontally stabilize it in one direction (perpendicular to the spine) but not the other?
I was about to say exactly this.
Applying force directly to the center of gravity with one finger is hard.
You end up applying torque plus adjustments in response to that torque. And that is heavily dependent on your moment of inertia, unlike the normal force.
But I do agree that explanations of friction are right up there with “how do airfoils work” where poor instructors are liable to get long past the edge of their knowledge and just make shit up.
Yep, cars can accelerate at over 1g.
Anyone who did undergrad lab work around 2000ish might throw in some comment about LabVIEW software and the number of times it crashes and loses all your data
2000s? My university's wind tunnel instrumentation was mostly LabVIEW.
It's been around a very long time and continues to be relevant. It's just that there was a window in time where it was feasible to have a graphical application made in LabVIEW be accessible to undergrads, crossing over with such a thing being quite unstable.
Lmao my entire undergraduate physics program is still entirely LabVIEW instruments.
hahaa, I love it! That right there is engineering and true work and dedication. I can hear the frustration and it's 100% warranted.
I wish universities were better equipped for what you pay. Where is all that money going anyways? Leaking like free electrons?
The 2023 education and general fund budget for Penn State allocated 5.7% of approximately $2.5 billion to equipment, maintenance, and repairs. I assume that would include things other than just lab equipment.
Overwhelmingly, most education fund use goes to salary, benefits and student aid (~$2 billion, 81%).
Interestingly the amount of money raised by tuition and fees almost exactly matches the amount spent on salaries, benefits and student aid. So one way of viewing it is that things like lab equipment are basically funded by grants, gifts, and state appropriations.
I assume this would be similar at Wisconsin in the late 90s, I doubt universities have changed much.
Maybe research budgets offer more flexibility and better equipment but I doubt the undergrads get to touch that stuff.
Source: budgetandfinance.psu.edu
Wow, you went down the rabbit-hole, much appreciated! Do I understand correctly that your analysis boils down to the money indeed being pocketed rather than used for equipment or improving anything at all at the university?
The gym, I think. Usually the brand new buildings, too.
It’s going to the salaries of a few elite people in the university system. It’s not that far off from the wealth inequality of the real world.
> I should've declared CS. I still wouldn't have any women, but at least I'd be rolling in cash.
Should we tell him?
A thing of beauty is a joy forever - John Keats
Honestly, physics is so full of pretension and hero worship. Even among seasoned lecturers there's a tendency to mythologise the progress of the art by making it sound like all the great results we rely on were birthed fully-formed by the giants who kindly lend us their divine shoulders.
Ironically there's a kind of Gell-Mann amnesia here; working scientists know that most of your work will consist of stumbling down blind alleys in the dark and looking for needles under lampposts that aren't even near the haystack.
I'm reminded of an anecdote which I can't currently source, but as I remember it Hilbert was trying to derive the Einstein Field Equations by a variational method. He correctly took the Ricci curvature R as the Lagrangian, but then neglected to multiply by the tensor density, sqrt(-g). This is kind of a rookie mistake, but made by one of the history's greatest mathematical physicists.
Anyway I love this article, it's a breath of fresh air and rightly beloved by undergrads.
(edit: for a counterpoint to this work please see another classic: "The physics is the life" -http://i.imgur.com/eQuqp.png )
There's a single instance in Einstein's notebooks where he attempts to use numerical methods to come up with a result. He manually graphs some result of the cosmological constant and then integrates it by counting the squares under the curve.
An esteemed emeritus professor of engineering I know used to cut out the graph and weigh it on a sensitive scale to integrate. It was not an uncommon technique.
I find that hard to imagine, considering we're talking about coupled partial differential equations in four dimensions. Well, if that's true, it really goes to show his desperation, I guess.
There seems to be a bit of confusion about the Hilbert-Einstein controversy [1], and I believe consensus is that Hilbert derived the equations a few days before Einstein, but did not claim ownership of the research. But this is the first time I'm hearing that Hilbert made a mistake. (I mean, maybe he did, but he got the right result eventually.)
[1] https://physics.stackexchange.com/questions/56892/did-hilber...
I was about to link you what I thought was best coverage of the priority I knew about, https://www.science.org/doi/10.1126/science.278.5341.1270 but now I see that's in the second edit of the accepted answer at your link.
(I certainly count myself among the confused, but I don't think there's any real dispute to answer.)
See also: this work alleging some foul play in the historical record - https://www.degruyter.com/document/doi/10.1515/zna-2004-1016...
I feel like forgetting to multiply by sqrt(-g) must have been a pretty easy mistake to make back then. This stuff was new!
On the contrary, what is presented by the OP is one of the many reasons that worship of science's heroes, unfashionable for decades as a whiggish pablum, is justified. If great results were birthed fully-formed -- a view I've frankly never heard anyone profess who has bothered to consider such things even briefly -- they would hardly be heroes. Even little children who reflexively chomp on every superhero film aeroplaned towards their face understand this.
Just physics is like this? Hero worship like this is pretty endemic.
It’s weird because on one hand it promotes this disempowering mythology that all progress comes from a vanishingly tiny fraction of humanity, but on the other hand people find it inspiring because if heroes exist then it means people (and maybe you!) can do amazing things. It’s a weird double edged sword.
Fwiw I certainly didn't mean to say this is unique to physics, I'm just not qualified to comment on other fields. Furthermore you make a good point, the hero worship is fruitful. Anecdotally I'd say a full third of my undergrad cohort cited Feynman's auto-hagiography as part of their decision to study physics.
(I also note that any double-edged polyhedral sword is necessarily degenerate.)
But that so-called best fit line is not an exponential. Exponential functions are convex, that line is concave.
I’m afraid you’ll have to repeat the experiment.
It’s not literally a pure exponential function but it might have exponential terms as opposed to polynomial or linear terms
For me, it wasn't the subpar equipment, it was the subpar instruction. I will never forget trying to explain to the graduate TA leading my circuits 1 lab that, no, you cannot use a multimeter to measure the impedance of an element in a circuit while the circuit is live, and that that is dumb for multiple layers of reasons.
He got pissed off at me for questioning his authority. I told the class "Uh, guys, why don't we all wait until [GTA's name] and I talk this out before proceeding, unless y'all want to be replacing fuses in the multimeters" - that REALLY pissed him off.
He was yelling. He told me I needed to talk to him in the hallway. I informed him that if I was wrong, this would be a great lesson for the class, and that, no, I will not be going somewhere to be yelled at in private; anything he had to say could be said there. That really did it. He yelled more. I was laughing at his tantrum. He took me up to the lab lead (not the prof overseeing the class - not 100% sure of how this person fit into the hierarchy), intending to get me kicked out of the class for disrespect. He goes on to this guy about how I'm the worst, and I just stand there, smiling.
Finally, lab lead guy has gotten tired of the second hand yelling, and asks for my side - He wasn't oblivious to the fact that I'm sitting there fiddling with my 12AX7 necklace while leaning on my longboard I burnt with high voltage. I oozed the hardware hacker ethos very visibly - and I respond simply "He told the class to measure impedance, with an ohmmeter, while the circuit was live"
It was at that moment I learned it was this lab lead's role to repair equipment (or at least replace fuses) when things like this happen.
Watching that GTA have to tell the class "I was wrong" after he was yelling at me in front of everyone had to be the best.
---
Fast forward a year, and I got to deal with even more mind numbing stupidity: https://opguides.info/posts/whydidipay/#8---senior-spring-20
We went to school together :) I would agree that Prof Sayood's "Signals and Systems" was a great class. I would agree that many TAs, including myself when I was a TA, were confused and/or overwhelmed.
If anyone is wondering what the author is up to these days, apparently he's a staff engineer at Google according to his LinkedIn.
>Do you have any idea how hard it is to solder wires to germanium?
I've just soldered SOT-723 onto SOT-23 adapter board, I can solder anything to anything
Seriously I wish more science writing resembled this
Looks like he didn't measure the temp correctly, who knows what the real temp was inside the crystal.
A classic I will never not upvote.
Maybe the frustrations of undergrad lab work would be easier to swallow if they were better situated in historical context. This kind of result should give the experimenter some sympathy for the folks who originally made these discoveries, with less knowledge and worse equipment. But I don't think it's usually explained that way.
Can we agree that this is one of the greatest abstracts of all time?
This reminds me of how the Fahrenheit scale came about.
For all its flaws, Fahrenheit was based on some good ideas and firmly grounded in what you could easily measure in the 1720s. A brine solution and body heat are two things you can measure without risking burning or freezing the observer. Even the gradations were intentional: in the original scale, the reference temperatures mapped to 32 and 96, and since those are 64 units apart, you could mark the rest of the thermometer with a bit of string and some halving geometry. Marking a Celsius scale from 0 to 100 accurately? Hope you have a good pair of calipers to divide a range into five evenly-spaced divisions...
Nowadays, we have machines capable of doing proper calibration of such mundane temperature ranges to far higher accuracy than the needle or alcohol-mix can even show, but back then, when scientists had to craft their own thermometers? Ease of manufacture mattered a lot.
They say two points but it was really three. The ammoniac mixture at 0F, water freezing at 32F and body temperature at 96F.
Also Celsius, for whatever reason, originally put boiling at 0 and freezing at 100. Maybe Sweden is just that cold.
James Burke's "Connections" series covered this in series 3, episode 10. Here's that clip:
100 + 28 degrees are not harder to mark than 64, and then aim 0 and 100 properly. :-/
What would be the process to do that? To aim 0 and 100 properly, you'd need a tool to calculate a 100:28 (25:7) ratio on an arbitrary distance, wouldn't you?
One can build such a tool, but it's not a doubled-over piece of string.
Surely you can simply use a ruler and rotate it away from the parallel to achieve any arbitrary scale, right?
Well... Yeah. It costs money and/or time to make things. That's as much true now as it was back then, but I don't think a ruler would have been particularly difficult to make, even with 18th century technology.
Make marks on the thermometer at 0 and 100 degrees C, then project light from a candle to a wall to see these marks with say 5x magnification. Now project marks from the 128 mark ruler to the same wall and align marks from both, then place marks on the thermometer with 5x better accuracy.
Sounds doable, but again, you're comparing that approach to:
- get some string
- measure a length between your low and high points
- fold it in half
- make a mark at the halfway point of the string
- fold it in half again, etc.
No candles, projection, transparent or slotted ruler, wall, or carefully moving one's hand back and forth under projected magnification needed. Just some string.
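The string procedure is easy to sanity-check in code: starting from the two reference marks 64 units apart, six rounds of halving land a mark on every integer degree. A quick sketch (function name is mine):

```python
def halving_marks(lo, hi, halvings):
    """Start with the two reference marks, then repeatedly bisect
    every interval - the fold-the-string-in-half procedure."""
    marks = [float(lo), float(hi)]
    for _ in range(halvings):
        mids = [(a + b) / 2 for a, b in zip(marks, marks[1:])]
        marks = sorted(marks + mids)
    return marks

# Fahrenheit's 32 (brine/ice) and 96 (body heat) are 64 = 2**6 apart,
# so 6 halvings put a mark on every whole degree in between.
degrees = halving_marks(32, 96, 6)
print(len(degrees))  # 65 marks: every integer from 32 to 96
```

Doing the same for a 0-to-100 span stalls after two halvings (at 25-unit intervals), which is exactly the calipers problem described above.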
Henry Cavendish used them to measure G. I used lasers and never got even an order of magnitude closer.
Nice chart. Can't rule out the old null hypothesis eh!
so funny. i've read a few chapters of Discworld books that made me titter a lot less
maybe related?
Cracks in the Nuclear Model: Surprising Evidence for Structure
I'll repeat the same comment I made to the same article when it posted here about a year ago:
As an odd coincidence, I did the same experiment on a shoestring budget with substandard equipment also. I too used a fancy computer algorithm to get a best fit. Except that I managed to get four significant decimal places in the result — an improvement over the (also outdated) textbook.
The author of the angry rant had a life-defining experience of overwhelming frustration.
The same scenario resulted in a positive life-defining experience for me
It’s funny how unpredictably things pan out even in identical circumstances…
Try faking your data next time, dude! You will be famous for some time. Do you even know how hard it is to make data points that seem natural but follow some clear pattern you want it to follow? I spent a good half of a day looking for that proper inverse formula.
ROTFL at the abstract
Nobel prize, quick!
> Do you have any idea how hard it is to solder wires to germanium?
Has science gone too far? :D
Brilliant man.
A+, recommending for accelerated PhD program.
This should be a reminder that more than you would expect, "the results didn't replicate" is really a statement of how difficult science is to do well.
[dead]
[dead]
[flagged]
I also regret studying physics, lol, although in my case I thought fiddling with algebra would be the best job ever, until I got bored of using my mind as a compiler.
This is, more or less, exactly what happened when I took Electronics I in college.
The course was structured in such a way that you could not move on to the next lab assignment until you completed the one before it. You could complete the lab assignments at your own pace. If you failed the lab, you failed the class, regardless of your grade.
The second or third lab had us characterize the response of a transistor in a DIP-8 package, which was provided to us. If you blew it up, you got a slap on the wrist. That DIP-8 was otherwise yours for the class.
I could _never_ get anything resembling linear output out of my transistor. The lab tech was unhelpful, insisting that it must be something with how I had it wired, encouraging me to re-draw my schematic, check my wires, and so on. It could _never_ be the equipment's fault.
Eight (!) weeks into that ten week class, I found the problem: the DIP was not, in fact, just a transistor. It was a 555 timer that had somehow been mixed in with the transistors.
I went and showed the lab technician. He gave me another one. At this point, I had two weeks to complete eight weeks of lab work, which was borderline impossible. So I made an appointment to see the professor, and his suggestion to me was to drop the class and take it again. Which, of course, would've affected my graduation date.
I chose to take a horrible but passing grade in the lab, finished the class with a C- (which was unusual for me), and went on to pretend that the whole thing never happened.
That is enraging. I've seen similar things happen too, and it blows my mind how ridiculous some of these teachers can be. I don't know if it's dehumanization of their students in their minds or an utter unwillingness to devote 30 seconds of directed attention to understanding the situation and making a reasonable judgment, but whatever the cause, it is widespread. The only thing worse is when one of them adds something like, "life isn't fair, get over it" when it's fully in their power to make a reasonable determination.
The flip side of this is from the professor's perspective: some undergrad in every class will lie their ass off about why their assignment was delayed.
Unfortunately, this reality produces no good options when you can't tell whether someone is telling the truth: (1) make an exception, and risk being unfair to the rest of the class, or (2) don't make an exception, and risk perpetuating unfairness for the impacted student.
It's only "reproducible" if you find other 555's mixed in the shipment but not distributed to students. Depending on what the error rate in the shipment packing is, that might be very easy or it might be quite hard. At any rate, it's a stats problem that the professor is unlikely to want to engage with. Unfortunately.
For the next semester, a good prof would add a QA step or a harness that turns on a green light when you plug in a working-as-expected package. I can see how the lab assistant job gets plenty to do in a well-run course, and also how unlikely that is to happen in real life. There aren't enough incentives.
Translation: scammers get the green light.
It's a general problem with large bureaucracies. If you're a cog in the machine, the safest way is to always stick to the rules, and avoid any situation where one has to exercise discretion, since any personal judgment comes with potential personal responsibility down the line.
It bugs me that oftentimes there appears to be nothing but cogs (e.g. Intel)
Just wait until that teacher is your graduate advisor.
Well, in industry you have the weekly JIRA humiliation rituals; bad things are everywhere.
At this point it’s the track to get a visa to work and live in the US. I’ve met so many graduate researchers who put up with way more bullshit than I would ever deal with. It’s also why most grad programs are mostly immigrants.
I only took two electronics classes, but in the later one I was the class hero for just buying a bunch of potentiometers on Amazon so that we didn't have to waste all of that expensive time sitting around waiting for our turn with the only good one left. It cost me like $10.
Literal example of "bias for action." A+
> So I made an appointment to see the professor, and his suggestion to me was to drop the class and take it again. Which, of course, would've affected my graduation date.
I would have been tempted to ask him to write me a check for the extra semester of tuition, but I'm sure that wouldn't have made the situation any better (and maybe would have made him more likely to grade strictly).
>I chose to take a horrible but passing grade in the lab, finished the class with a C- (which was unusual for me), and went on to pretend that the whole thing never happened.
This sentence could have also ended "my GPA dipped below the threshold for some bullshit mark-it-up-to-mark-it-down exercise masquerading as a scholarship and I had to re-take the class for a better grade anyway"
Indeed it could have. I was on a fairly prestigious scholarship; luckily, my marks were good enough that this was a low-risk decision.
That said...
I graduated with a 3.2 GPA, after being the stereotypical "gifted" student up through high school. A 3.2 is, apparently, still decent. However, I did feel a bit of a twinge seeing my peers walk at graduation with cords, bents, and other regalia, where I just had my standard-issue black robe.
It had less to do with my grade in this particular class, and more to do with the fact that I had a part-time engineering job - 10-20 hours a week - and was making money. When you've spent a couple of years being broke, having an extra few hundred dollars per month was a big deal. Enough so that I didn't really care about putting the extra effort in for A's - that extra time was time better spent working. B's were fine if I could afford to take my girlfriend out to dinner every month.
In the years since then, it seems like this was a good decision. That job became full-time after college, and I stayed there six years. At the end of six years, nobody really cared about my college GPA. At the end of nine years (when I next looked for a job), I didn't even bother listing it on my resume.
Yeah, I'm not going to say that undergrad doesn't matter, but your grades are not exactly an indication of whether or not you're getting useful life and professional skills out of it. I was a straight-A high school student, but finished university a semester late with a 2.975 GPA. I've since had a wildly successful career in software development (my degree is in electrical engineering), and my college years toiling about in labs are but a dim memory.
Certainly the name of the school on my resume helped me interview for my first job, and I did learn a bunch about how computers worked and how to design CPUs, and that was useful early in my career when I worked on embedded software (like actually embedded, weak-ass MIPS machines with a handful of MB of RAM, and no MMU or memory protection[0]; not the tiny supercomputers that count as "embedded" these days). But my grades, and most of the getting-my-coursework-done drama? Irrelevant.
[0] And I'm sure some folks here will consider what I had to work with a luxury.
If you're heading to grad school, it can be essential to have connections made through UG research. That's a trade that isn't just advised in some fields, but necessary. A letter of recommendation that says you're actually useful as an RA is 1000x more of a direct implication that they would benefit from bringing you on than an A in a few classes. Academics don't like finding out that smart people are impossible to work with any more than industry does, and since they're tied to you for longer, in some ways it's an even bigger deal.
I probably learned more in my first year of working than I did in my degree. Not just technical skills and gaps that had been glossed over during study, but also about myself as an individual. You made the right choice, and it's one I wish I had the foresight and maturity to have made at that point in my life.
What I don't understand is why it took you 8 weeks to distinguish a timer from a transistor. That doesn't make your professor's reaction alright, I just find it puzzling.
It's a good question! I didn't think to check the markings on the chip. The lab tech was convinced I was doing something wrong with my setup, and likewise he had me convinced it must be something wrong with my setup.
Coincidentally, I've been knee-deep in some problems that I've applied the Cynefin framework to. I'd call this problem "chaotic", where throwing things at the wall might be _more_ effective than working down a suggested or tried-and-true path from an expert. I was pleasantly surprised just a few weeks ago where one of the more junior engineers on my team suggested updating a library - something I hadn't considered at all - to fix an issue we were having. (That library has no changelog; it's proprietary / closed source with no public bug tracker.) Surely enough, they were right, and the problem went away immediately - but I was convinced this was a problem with the data (it was a sporadic type error), not a library problem.
Or even more likely in a lab setting: have another student test your part in their setup for A/B validation testing.
That would be like exposing a first year CS student to a situation where "it could be a compiler bug" is one of the potential explanations.
It's closer to exposing a first year CS student who has never touched a computer before to Windows, when the work is supposed to be done on Linux, and the TA is hemming and hawing, and insists that the reason the sudo command isn't working is because the student is not following the steps correctly.
It's a problem that's obvious to diagnose... If you already have passing familiarity with the material. Most people do not have passing familiarity with electronic components when they step into an engineering program.
Not just a timer IC. Literally the most common IC in the world for at least every year from 01980 to 02000 or so, maybe still today. I can understand the first-year student not recognizing it, but what the fuck was the lab tech's mental disability?
I would assume that you don't have access to the lab (and diagnostic equipment) at all times, and are taking other classes.
Also him being a student, having the wrong component was probably not in his mental troubleshooting tree. I would guess that it was not in the lab assistant's troubleshooting tree either.
Also once you start down the road of troubleshooting, a false trail can lead you far into the woods.
Same package. 555 is typically a DIP-8, transistor packages are available in the same. So you would have to examine the cryptic markings and compare them with the other students, and that’s only if you suspected some fuckup on the part of the knowledgeable people.
Yes, my strict adherence to “trust but verify” was born from literal tears. It’s not worth trusting others if it takes a small fraction of the project's time to verify. It has saved me incredible amounts of time in my professional life, and I’ve seen months wasted, and projects delayed, by others who hadn’t cried enough yet.
"Trust, but verify" is just a polite (ie corporate) way of saying "Don't trust until you verify", right?
If after eight weeks a junior engineer is still toiling on their story, I'd ask why someone more senior didn't get involved.
There are lots of reasons - maybe the senior engineers are overburdened with other work (or don't care), maybe the project manager or team lead wasn't asking if the junior needed help, or maybe the junior was lying about their progress.
Either way, a story that goes for eight weeks feels excessive. Likewise, to your point, taking eight weeks to figure out that there was a bad part feels excessive. My counterpoint is that teams don't typically operate like labs. In a college lab, the objective is for you, specifically, to succeed. In an engineering team, the objective is for the entire team to succeed. That means the more senior engineers are expected to help the more junior engineers. They might directly coach, or they might write better documentation. I don't believe that dynamic is present in a lab setting.
> Should they expect a good performance evaluation?
They should expect that particular incident to not affect their performance evaluation, since it was very much not their fault.
In your hypothetical scenario, your hypothetical junior engineer went to the senior engineer repeatedly for advice, and the senior engineer did not do their job properly:
The lab tech was unhelpful, insisting that it must be something with how I had it wired, encouraging me to re-draw my schematic, check my wires, and so on. It could _never_ be the equipment's fault.
This is a huge failure in mentorship that wouldn't be ignored at a company that actually cares about these things.
For a college class grade, everyone is supposed to be tested on the same exercise. If all students were tested under the same scenario then it would be fair. For just one student to be tested under this scenario, but for all other students to get a free pass on the lab component identification diagnostic test, is not reasonable.
While it's ridiculous to expect a student to have the skills of a professional, even a student needs to develop the assertiveness to demand a replacement part. This is a basic skill for debugging hardware problems: see if the problem manifests on more than one unit. Here it would be demanding another chip to try, early on. Chips can be marked correctly but damaged or defective.
Ohm lordy, we're blaming the student for not having years of homebrew experience before he entered school? Sure, any hobbyist knows what a 555 is, but when the lab assistant doesn't even catch it and the chip was handed out to the student, this is not an entry-level student's fault.
A relatively cheap lesson in the importance of knowing your hardware.
You can create a timer with one transistor and an LC feedback loop.
This is crazy to me because when I've run labs in the past, there were equipment failures literally all of the time. When you teach lots of people, shit breaks. Quite often if something didn't work, I'd just have one student swap equipment with another student to help diagnose this sort of thing.
Major bummer that others have had differing experiences from me, here.
I ran labs in my university in Europe, in the early 2000s, and I'd like to think this would not have happened. We were selected as tutors due to our proficiency and dedication to the subject. Maybe it was a fluke, I've heard similar stories recently about local Unis.
this happens in "real life" as well
i spent a bunch of time trying to figure out why my 74LS20 wasn't being a dual 4-input NAND gate
turns out that was a date code, and it was some other chip entirely
1974 was a terrible year for 74xx series TTL chips
yes, i am old :-)
That's a tragic story. However, I'm surprised that the transistor was supposed to come in a DIP package. Usually through-hole discrete transistors come in a three-lead package like TO-92. Of course, that would not have helped you since yours looked like every other student's except for the markings.
Probably Darlington transistors like ULN2003
ULN2003 is not DIP8
And I would assume for pedagogical purposes a bare transistor would be preferred rather than the '2003 with its extra diodes and base resistors.
I was in honors freshman chemistry at university. Tough class, all homework (lots of it) graded rigorously, but only the midterm and final counted toward the course grade. So if you wanted an A you had to get an A on both exams.
After midterm, during every other lecture at least, the professor would sound a refrain: “An orbital is not a house! An electron does not live in a house!”
Final exam had a small number of complex problems to work out with pen and paper, tough stuff, lots of calculus. But the last question ended with “where does the electron live?”
That final problem, if you ignored the end wording, was super easy, something almost trivial to do with Helium iirc. The class had about 25 students in it; about 5 of us independently had the same thought: “this is a trick question, ‘the orbital is not a house in which the electron lives!’” And, independently, that’s how we five answered.
And we got marked wrong, all our course grades dropped to B+/- because of that one damn question.
Over a lunch or whatever, we discovered our shared experience and approached the professor as a group. He listened patiently and said: “Ah, right, I did insist on that idea, it’s understandable why you would think it’s a trick question and answer that way. But I still consider your answers wrong, grades stay as they are.” Some in the group even went to the dean and, to my understanding, he said it’s best to consider it a life lesson and move on.
Having gone both to a liberal arts institution and a large public university, it is not clear to me what the professors in the latter were actually doing vis a vis their teaching responsibilities that actually provided value.
Lectures that came straight from the book I could have read, recitations and problem reviews done by grad students, and tests that were little more than variations on homework problems of varying difficulty.
Maybe they were getting paid for research, but I dunno. At the liberal arts college, I actually received an education instead of bootstrapping it myself from a syllabus.
I agree this seems overly principled to me.
I recall a DSP class where there was an exam with a question like (not exactly this):
> What does the following code print?
> `printf("Hello, world!");`
If you responded with:
> Hello, world!
...which - of course - the whole class did, you got the question wrong.
If you responded with:
> "Hello, world!"
...which is actually not what that would print, you got the question right.
A small band of us went to the professor and noted that, in fact, `printf("Hello, world!")` does not print the quotes. But he wanted us to show that it printed a string, and we denote strings by quotes.
This was something that we learned to do just for him - all strings had to be enclosed by quotes, to denote that they were strings. As far as I'm concerned, it served no practical purpose; we never had to differentiate strings like "Hello" from ['H', 'e', 'l', 'l', 'o', 0] or other representations.
A better example of how this could go - and not one that had anywhere near the same stakes - was a question on the entrance exam for my college radio station:
> What is the airspeed velocity of an unladen swallow?
I got this question right by answering, "Ni!"
Yeah that sucks, the hard life lesson where you have to swallow your pride and go "fine, I will put quotes on the infernal thing"
But... it does not print a string. A string goes into it, but what comes out of the function is not that programmatic feature we call a string in any way, shape, or form. What comes out depends on the output device specified: it may be ink on paper, lit phosphors, or a stream of bytes, none of which you can use in your program as a string.
sprintf being a notable exception, of course; there you do get a string out of it.
Update: language is weird and the more I read my statement the weirder it gets, all I can do is add this cryptic note. "When you print a string it does not produce a string", this usually means I am wrong about something.
I have my own, similar stories but, as a mediocre student, they were lost in the noise of my own mistakes. Nonetheless it's injustice, so I still remember it. Some people on this site were really excellent students, where these deductions really mattered. I don't know how they cope.
Depending on your environment, the above printf might print nothing at all, because there is no trailing newline.
The kind of prof who never coded a useful program in his life.
Yeah? What "life lesson" does the dean think you're going to learn from that? That authority figures cannot be trusted because they will hurt you with bureaucratic stupidity. Does the dean, as an authority figure, really want that to be the lesson you learn?
A kid might see it in terms of "authority figures," but live long enough and you'll understand it's everyone. Not just your professors and bosses, but your subordinates, your friends, your lovers, even your children will judge you unjustly from time to time. But that doesn't mean the world is poison and existence is a curse. It just means you have to learn to get by despite other people's imperfections the same way they get by despite yours.
That sometimes you can do everything right and still lose.
It's funny, because while that's a terrible educational experience, you actually learned some important lessons despite them.
I remember the first time I found out that the software documentation I had been relying upon was simply and utterly wrong. It was so freeing to start looking at how things actually behaved instead of believing the utterly false documentation because the world finally made sense again.
Sometimes it's not even rare for documentation to be wrong. The documentation for a vendor who I won't name - but might be at Series J and worth north of $50 billion - seems to be wrong more often than it's right.
We frequently say, don't blame the tools, it's you. That pushes "blame the tools" outside of the Overton window, and when we need to blame a tool, we're looked at like we have five heads.
Ten years ago, I was dealing with a bizarre problem in RHEL where we'd stand up an EC2 instance with 4GB of memory, have 4.4GB of memory reported to the system, and be able to use 3.6GB of it. I spent _a long_ time trying to figure out what was going on. (This was around the time we started using Node.js at that company, and needed 4GB of RAM just for Jenkins to run Webpack, and we couldn't justify the expense of 8GB nodes.)
I did a deep dive into how the BIOS advertises memory to the system, how Linux maps it, and so forth, before finally filing a bug with Red Hat. 36 hours later, they identified a commit in the upstream kernel, which they forgot to cherry-pick into the RHEL kernel.
That's a very human mistake, and not one I dreamed the humans at Red Hat - the ones far smarter than me, making far more money than me - could ever make! Yet here we were, and I'd wasted a bunch of time convinced that a support ticket was not the right way to go.
> Yet here we were, and I'd wasted a bunch of time convinced that a support ticket was not the right way to go.
From my experiences with public issue trackers for big projects, it's very reasonable to postpone creating a new issue, and rather follow my own hypothesis/solution first:
* creating a new issue takes significant effort to be concise, provide examples, add annotated screenshots, follow the reporting template, etc., in hopes of convincing the project members that the issue is worth their time.
Failing to do so often results in project members not understanding or misunderstanding the problem, and all too often leads to them directly closing the issue.
And even when reporting a well-written issue, it can still just be ignored/stall, and be autoclosed by GitHubBot.
In my case, it was egregiously bad, because someone had cribbed docs from an entirely separate scripting language that did almost the same things. Most of the same features were there, but the docs were utter lies, and failures were silent. So you'd go down the wrong branch of an if statement because it wasn't checking the conditions it claimed to check.
Once I started actually testing the scripts against the docs and rewriting them, life got so much better. The worst part is that it had been that way for years and somehow nobody noticed, because the people using that horrible scripting language mostly weren't programmers, and they'd just tweak things until they hit the happy path just enough to kinda-sorta work.
Proving the truism that incorrect documentation is worse than no documentation.
I took and then TA'd a class where the semester long project was to control robots (it was a software engineering principles class, the actual code writing could be done in a single weekend, but you had to do all the other stuff of software engineering- requirements analysis and documentation etc).
We had a software simulator of the robots, and the first lab was everyone dutifully writing the code that worked great on the simulator, and only then did we unlock the real robots and give you 2-3 minutes with the real robot. And the robot never moved that first lab, because the simulator had a bug, and didn't actually behave like the real robot did. We didn't deliberately design the robot that way, it came like that, but in a decade of doing the class we never once tried to fix the simulator because that was an incredibly important lesson we wanted to teach the students: documentation lies. Simulators aren't quite right. Trust no one, not even your mentor/TA/Professor.
We did not actually grade anyone on their robot failing to move, no grade was given on that first lab experience because everyone failed to move the robot. But they still learned the lesson.
> because the simulator had a bug
I had something similar happen when I was taking microcomputers (a HW/SW codesign class at my school). We had hand-built (as in everything was wire wrapped) 68k computers we were using and could only download our code over a 1200-baud serial line. Needless to say, it was slow as hell, even for the day (early 2000s). So, we used a 68k emulator to do most of our development work and testing.
Late one night (it was seriously like 1 or 2 am), our prof happened by the lab as we were working and asked to see how it was going. I was project lead and had been keeping him apprised, and was confident we were almost complete. After waiting the 20 minutes to download our code (it was seriously only a couple dozen KB of code), it immediately failed, yet we could show it worked on the simulator. We single-stepped through the code (the only "debugger" we had available was a toggle switch for the clock and an LED hex readout of the 16-bit data bus). I had spent enough time staring at the bus over the course of the semester that I'd gotten quite good at decoding the instructions in my head. I immediately saw that we were doing a word-compare (16-bit) instead of a long-compare (32-bit) on an address. The simulator treated all address compares as 32-bit, regardless of the actual instruction. The real hardware, of course, did not. It was a simple fix. Literally one bit. Did it in-memory on the computer instead of going through the 20-minute download again. Everything magically worked. The professor was impressed, too.
If someone complained to us TAs during or after the lab that the simulators were incorrect, we were quite open that indeed they were, and that was not our doing, but we were okay with it because lying documentation was a part of the real world.
The professor had been doing the class with those robots for several years when I took the class the first time, but I don't know if he acquired that brand of robots because their simulator was broken or if that was just a happy accident that he took advantage of.
The lesson certainly has stuck with me- this was one lab in a class I took almost a quarter-century ago and I vividly remember both the frustration of not moving the robot and the frustration of everyone in the sections that I TA'd.
you should have gotten an A for being a real engineer
Honestly, you got more real-world electronics training out of that experience than you paid for. You are now qualified to deal with remarked or counterfeit Chinese parts and other inevitable supply hazards in the business. Be grateful!
Yes, maybe true. But it's a pity that was not reflected in their final grade.
This makes me incredibly grateful for my physics lecturers, all of whom would bend over backwards to assist their students' journeys towards learning any time any stumbled or showed a spark of curiosity that needed fanning into a raging fire.
I had lecturers give me bonus marks above 100% because I noticed issues like this and thanked me for helping to improve the course material!
These lecturers, when merely overhearing a curious "huh?" conversation between students would spend hours of their own time scouring the library for relevant information and just "leave" photocopies for students to find the next day.
> From grade school on, you are graded on whether or not the grading authority likes the results you got.
I took an exam in a high school science class where I answered a question with the textbook's definition exactly as presented in the textbook, complete with the page number the definition was found on. I knew a bit about the topic, so I then cited outside scientific sources that explained why the definition was incomplete. There wasn't enough room to complete my answer in the space provided, so I spiraled it out into the margins of the exam paper.
My teacher marked my answer wrong. Then crossed that out and marked it correct. Then crossed that out, and finally marked it wrong again. During parent-teacher conferences, the science teacher admitted that even though I answered the question with the exactly correct definition, my further exposition made him "mad" (his word), and because he was angry, he marked it wrong.
Having been on the other side of the table... there's a tactic students will sometimes use, where they don't understand the question but will simply attempt to regurgitate everything written on their notecard that is related in hopes that they'll accidentally say the right words. Sounds like you did understand it, but the volume perhaps made it look like you were just dumping. It is indeed annoying to grade.
Grading is boring, tedious, and quickly wears down one's enthusiasm. The words of M. Bison come to mind: "For you, the day Bison graced your village was the most important day of your life. But for me, it was Tuesday."
Sure, we could speculate about his true unstated reasons for marking wrong my answer.
I highly doubt the science teacher marked me wrong for "dumping", though. He had every opportunity to explain that to me after I got my exam graded and I asked him about it. Then he had the opportunity to explain that face-to-face with my parents. He did not do so. He said that while I got the answer right, he was "mad", thus the mark against.
Besides, notecards were not allowed for any part of the exam, and I wrote my answer from memory. I think it was clear that I knew my stuff pretty well and was not "dumping" a bunch of bullshit onto the science teacher.
There was no indication before taking the exam that I would be punished for hurting his apparently-sensitive feelings while giving the correct answer (as he agreed I did). If there were, I certainly would have chosen a different medium for proving my command of the material.
Authoritarianism, everywhere and in all forms, seeks to shut down curiosity and critical thinking.
I distinctly remember a student arguing with a teacher for a mark.
"Look sir, here in the scrawl at the margins is the answer you just said was right"
"Yes Dylan, but this was a 1 mark question. Part of getting the mark involves putting the answer inside the space provided."
I used to write my undergrad history essays in rhymed couplets because I figured the grad assistant doing the grading would be grateful for a break in the monotony and it was faster and easier than writing an actual good essay. Probably wouldn't work in the LLM era, but it was very effective in the 90's.
> he was angry, he marked it wrong.
That’s grounds for termination to me. Seriously. I would put this man out of a job and endanger the livelihood of him and his family for this kind of shit.
And if you CAN'T terminate because of admitted emotional grading, the system is too tightly captured by outside interests to the detriment of the client: the student and society.
A teacher is a professional entrusted with the most important responsibility society can offer: training and educating the next generation. It must adhere to the highest of professional standards and expectations.
That we don't pay enough to require that without reserve is a statement on our societal priorities, and disconnected from the expectations that should hold.
EDIT: clarification/word choice
If you like watching right-wing educational propaganda, sure.
You seem very angry yourself, and willing to let that anger guide you to harming someone. Are you so different from that teacher? In fact, you might be worse, while he only gave a grade (one of many surely, likely to have no long term impact on life prospects or survival), you would have this man made homeless? Don't be so quick to assume a teacher (at least in the us) has been able to accrue sufficient savings to endure a ruined livelihood. Sounds very, very extreme to me. Might there be a more charitable interpretation of the words, might there be information that we don't have that would, say, humanize the human being you'd like to ruin? Maybe we could take the time to understand these impulses in ourselves and be the example we want rather than reflecting the pain we hate to ever increasing magnitudes.
It’s one question on a school exam, friend…
And at least the guy had the honesty to admit his irrationality when called on it. That, to me, reads more like coming to terms with his error in an edge case than a systematic campaign of maliciously defrauding the student.
There's a certain irony in your outrage at his failure to control his emotions, even as your own rage leads you to dream of hurting his family.
Is it rage?
If he murdered someone I would put him in jail and that will harm his family too.
There is a fine line between justice and compassion, and if you never cross it to enforce justice, you end up with corruption: nothing can be enforced, because inevitably all enforcement leads to harm.
>That’s grounds for termination to me. Seriously. I would put this man out of a job and endanger the livelihood of him and his family for this kind of shit.
Agreeing with you as a former instructor (who left academia for greener fields after completing the PhD).
I've had people cry on me in office hours because they come out with — quite literally — PTSD from instructors like the one we're discussing.
It's nothing short of psychological abuse of children, and it leaves lifelong damage.
It's worse than no instruction at all. I've had to have college kids unlearn things before I could teach them.
We've got to draw a line somewhere. I draw the line at actively traumatizing children.
That person should not be allowed to teach, period. We'd do both their students as well as themselves a huge favor by removing them from teaching.
By all indications, they'd be a happier person doing something else, where they wouldn't be driven "mad" by seeing that they've done a good job — which, for a teacher, means their students being proficient in the subject they teach.
-----
TL;DR: this teacher was driven "mad" by seeing that he's done a good job, and one of his students was really good in the subject.
Spare them from this pain.
> Right on up to professorships, this is how science really works.
Reminds me of Feynman's "Cargo Cult Science" essay[1]
Yeah, not sure I'm 100% agreed on that last statement (:

[1] https://calteches.library.caltech.edu/51/2/CargoCult.htm
I would take Feynman's stories with a grain of salt; he was sometimes quite liberal with the facts when trying to make a point (in particular, he liked to give the impression that he was the only smart guy in the room).
The actual history is a bit more complex and certainly is not reflected accurately in Feynman's retelling (maybe he was affected by confirmation bias?). See this Skeptics Stack Exchange discussion: https://skeptics.stackexchange.com/questions/44092/is-feynma...
Context:
https://en.m.wikipedia.org/wiki/Oil_drop_experiment
Assuming Feynman's statement is true, I find it even more remarkable that Millikan's electron-charge research was published in Science AND won him a Nobel Prize without anyone noticing the very apparent mistake of using an incorrect value for the viscosity of air.
I can totally relate. I had the same experience in grade school science class, where the teacher assigned an experiment with a suggested solution and an invitation to come up with your own method.
I was the only person in class that chose to do my own method. And, it didn't work because I didn't account for an environmental difference between my house and the school classroom. And, he gave me a failing grade.
It really killed my interest in physics for a long time. I focused on biology from then through college.
Ultimately, the problem was that he didn't make clear that the only thing we were being graded on was accuracy, not experimental method or precision. (My solution was precise but inaccurate, whereas the standard solution was accurate but imprecise.) Also, it's possible everyone else in class knew the culture of the school, and I didn't, because it was my first year there. So I didn't realize that they didn't value creativity in the way I was used to.
We had the task of building a small, highly insulated house, big enough to hold a hot cup of tea (and we measured how well it held its temperature inside).
Our design was very, very good in that regard. (I used insulation material from the house my family was building at the time.) But granted, it was not so pretty.
Prettiness was not a stated goal, though. Yet when it came to grades, suddenly design and subjective aesthetics mattered, and a pretty house that was useless in terms of insulation won. We didn't fail, but we got a fairly bad result, and I stopped believing in that teacher's fairness.
I mean, the other side of the coin is that engineering schools are a giant circle jerk that churns out thousands of graduates every year who, left to their own devices, will design things that cannot be made from the available inputs, using processes that are not appropriate.
I'm not saying you've got to prioritize looks, but you've got to think a few steps ahead and understand the ancillary criteria that will make or break a design, all else being equal or nearly equal, as well as the unstated assumptions of the party evaluating your work (e.g., won't look like ass, can be made in volume).
I get that, and I think "pretty" is a dumb goal, because what's "pretty" is usually just cargo-culting of whatever works. Reading your customer is a useful lesson, though, but it probably should be taught intentionally, not accidentally via bad teaching.
The irony is that you learned something. Failure is a very useful learning opportunity in understanding what affects the success of an experiment, so long as you analyse it and demonstrate that, which arguably is where you should have been encouraged and graded. Compared to accidentally succeeding while following a standard procedure.
I write learning software, and this is an interesting pedagogical weakness we've become aware of when giving feedback (the asymmetry of learning opportunity in correct vs incorrect). It can be improved through overall design, and in a digital context there are also other opportunities.
Yes, he learned to avoid physics. Good job, teacher!
> scepticism of the validity of learning institutions
Can confirm, this is solidly wedged into my opinions now. There were a lot of other experiences after this to compound that feeling.
In high school, I started looking aggressively for a less traditional path and fortunately found one. It really saved me, because I was forced back into the traditional environment in my senior year of high school, and my grades tanked from top-of-class to "you might need summer school to graduate" level.
Things got a lot better in college, because that experience (among others) helped me effectively navigate the institution, jump directly into more advanced coursework, and earn more freedom to study things that were interesting to me.
I did get a job in my field out of college. So, my college pedigree was useful practically (though not really any knowledge I got there). But, I'm self-taught dev now, which is an amazing fit for my experience and attitude.
That’s awful honestly, did you ever regain that interest in physics later in life?
No, indeed I found a way to skip physics in high school (though this wasn't really why). But, I was interested in Biology, taking almost enough for a minor in it in college.
I'm a self-taught dev now. And, that fits really well for me, despite being completely unrelated to my college degrees. I work mostly with other self-taught, passionate about software people. And I'm loving that.
But, I do have very strong opinions on institutions and pedagogy. I've gotten into some pretty epic arguments about it with my wife, who is a music teacher. And, her experience has been so completely opposite of mine.
From the way she tells it, classical music seems to be the ultimate discipline where structured education is paramount. And, I have such a negative opinion of traditional methods that it's caused some friction.
My physics professor told us once about a lab he had to do when he was a student himself, about measuring the adiabatic gas constant of air. The workload at that point was immense, so lots of students would just write a report and give the textbook answer—and be marked wrong.
It turned out the TA had sabotaged the experiment by putting alcohol in the bottom of the (dark glass) measurement bottle, so the measurement would be of the constant of “air with a fair amount of alcohol vapor in it”, which would give a different constant. And if you actually did the exercise, you'd get that “wrong” number, and that would be the only way to get the lab approved.
That would be a very valuable lab, IF students hadn't been explicitly trained in opposite behaviour for a decade by then.
I lived a very similar experience:
My 4th year computer science professor in software engineering assigned us a four-phase programming assignment for the semester.
My teammate and I spent several sleepless days on the first assignment, and felt some of the requirements were contradictory. Finally we reached out to the professor, and he formally clarified the requirements. We asked him, "well OK, if requirements are unclear, what are we as students supposed to DO?!?" and he answered - exactly what you did; ask the user/client for clarification. "OK, but what if we hadn't, what if we just made assumptions and built on those??". And his eyes twinkled in a gentle smile.
My team mate and I had worked in the industry as summer students at this point, and felt this was the best most realistic course university has offered - not the least because after every phase, you had to switch code with a different team and complete next phase on somebody else's (shoddy, broken, undocumented) code. This course was EXACTLY what "real world" was like - but rest of the class was trained on "Assignment 1, question 1, subquestion A", and wrote a letter of complaint to the Dean.
I understood their perspective, but boy, were they in for a surprise when they joined the workforce :)
>That would be a very valuable lab, IF students hadn't been explicitly trained in opposite behaviour for a decade by then.
I teach students sometimes. I briefly considered whether I should give them such an important lesson. Very briefly: my job is to teach students my specialty, not to give them life lessons. Why would I deal with potentially angry students for doing something it's not obvious I'm allowed to do? Hell, it's not even obvious it would be a "good" (career-advancing) lesson.
I agree that it's a valuable skill. But am I, an expert in a narrow field (very far removed from any soft skills), the person who should teach it? When some students raise a complaint, how will I explain to the university management that this twist, even though completely unrelated to what I am supposed to teach, was actually a good idea?
I'm just saying that, even with good intentions, the incentives are not aligned with teachers going too far out of line.
In one class I took, we were examining a range of car engines for faults and the task was to get it running.
The rumour was that the previous year's class had one engine where the ignition rotor-arm wire had been replaced by a section of coloured plastic, which was covered in the usual grease and crap in the housing.
The instructor was looking for persistence and elimination of possibilities rather than actually solving it. But one team did. As long as you solved the others that was enough to complete the class.
As bad as the prior story is, I don't know if intentionally misleading the students is the right way either. What if one had realized the contamination and, acting in good faith, had cleaned out the bottle? What if they did this afterward and ended up redoing the experiment, only to be told they had cheated?
I'm all for exposing students to something unknown, but telling them they're doing X when it's really Y for anything longer than a single lecture ain't it.
You can square that circle by announcing at the beginning of the course that there is going to be some assignment like that, but I'm not telling you which, because the real world doesn't.
I do agree this is a good point; trust is not something that should be simply squandered. Nevertheless, this is still a lesson that needs to be taught and so often students make it to the end without a single teacher that did.
Right but doing it without saying anything is much worse.
Given that a report is supposed to tell what you did and then your calculations and conclusions, you'd better include something as dramatic as “we washed the equipment after getting the wrong results and detecting contamination”…
If you detect it and think it's relevant, that might be worth a note. But "reset and start over" is something that could reasonably be thought of as outside the scope of the report. You're reporting on the experiment, not logging your entire time in the lab.
The trouble with these kinds of games is that they put the more diligent students at a disadvantage. For example, someone might compare their experimental result against the textbook constant, realise it's wrong, and spend much more time trying to identify their "mistake", not realising they've been sabotaged. This puts further pressure on their other work.
One cannot argue that this is fair on the basis that it's the "real world", because all that does is reward the sloppier (middle) approach. It filters the very lazy from the average, but at the expense of the excellent.
Not only that, but an appropriately diligent student might notice with their eyeballs or nose that their bottle contained alcohol, and clean/dry it before performing the experiment.
Given that the labs were with TAs present, at that point, you'd just go to the TA and they'd tell you to write down the number even if it didn't match.
Even as I rather vigorously grumble at the status quo, let it be noted that I celebrate those iconoclasts fighting the good fight all the more for the fact that they are going against the status quo to do so. May their tenacity and creativity ultimately prevail.
> you are never graded on whether you did your best and honestly report what you observe. From grade school on, you are graded on whether or not the grading authority likes the results you got. You might hope that there comes some point in your career where that stops being the case, but as near as I can tell, it literally never does. Right on up to professorships, this is how science really works.
This, so much this. I disliked any lab work in my science classes (in HS/College) for this exact reason. I can't tell you how many numbers I fudged because I wasn't getting the "right" results and there was no time/appetite/interest in figuring out why it was wrong, my options were lie and get a good grade or report what I saw and get a bad grade.
And yes, in college specifically, the equipment we were working with was rough. There was a lot of "let's ask the other 2 groups near us and we will all shave our numbers a bit to match/make sense".
The same thing happens in organic chemistry. You're graded on your yield. If you put 10 units of A in and cooked up 9.9 units of product B, great job! But if it's 0.01 units, good luck, and if it's 0, heaven save you. Of course, they might give you 15 units of A to begin with when you're only supposed to use 10. So at the end of it, you get 9.9 out of 15 in, and say you only put 10 in. And if you get 14 units of product out of "10" in, you just cut down the product accordingly. I'm pretty sure, with organic chemistry lab being a core pre-med course, that this might be more the norm than the exception.
> flunked everyone who claimed they got the supposed "correct" answer to three significant digits because that was impossible

While I've never seen anyone flunked for this, I certainly have taken off substantial amounts of points, and seen others do the same, for 3 significant figures when 2 is the absolute highest reasonably possible (and realistically, one sig fig was what we actually wanted).
I've run the exact lab you're describing, and I think we gave full credit for anything between 5 m/s^2 and 20 m/s^2, provided there was some acknowledgement that this was at odds with what was expected. We very often would check in halfway through class and either tell the kids what they were doing wrong, or even tell them to write something like "this is at odds with literally all known science and I think I don't trust this". For this particular lab, I've never seen errors as large as the ones you've described, so your lab was likely very poorly set up.
In other cases, I've made extra time (and allow students to come in) in case their numbers were so weird as to be problematic; just depends on the lab. Any teacher worth their salt will do this. It's a shame the teachers you had were terrible and incentivized bad stuff.
If being in a lab has taught me anything, it's that doing good science is often morally difficult. Sticking by your guns is hard.
But you are right in some sense: there are definitely incentives to... misreport. The best we can do as teachers is to reduce those as much as possible and reward kids/students for being honest.
On the other hand my experience as both a graduate and professor teaching students are equally discouraging.
1. Most students don't want to have to think. As a student, I was always annoyed that we'd be given exact instructions with an exactly known result to reproduce, while this is generally not how real experiments work. So when I designed an experiment, I wrote instructions that reflected the real-life experience more closely, i.e., instead of "place the lens A 10mm from object B" it was "place the lens one focal length away from the object; to find the focal length of your lens you can use a light source at infinity (far away)". After I left my university, the instructions were reverted because students complained that they didn't get step-by-step instructions.
2. Students dutifully write down measurements that are off by several orders of magnitude with absolutely no acknowledgement or discussion. I have seen a speed of light barely faster than a car, and the mass of a small piece of material in the hundreds of kg (usually because students forget a nano or giga in a calculation), without any discussion that the result is nonsensical.
3. Similarly, they make a fit like the one in the OP and don't even discuss the error bars. Or (and these are already the better students) they make a fit with tiny error bars but get the wrong result (typically due to some mistake like the above), and in the discussion say the difference from the expected value is due to measurement error.
Now, I also know that there are crappy graduate students who teach the "only get the correct result" way, but it's often very difficult to improve teaching, because students will immediately complain that they have to adjust to changes.
The worst is college science classes where sometimes the provided equipment and/or procedures aren't even correct, and the professor isn't around and you're dealing with a TA who is just as confused as you are.
So you debate with yourself between writing down the effect you got (and trusting that you will be rewarded for integrity and effort and rigor), or simply writing down what you know the effect was supposed to be.
Most people (smartly) do the latter.
>Roll a ball off of a standard classroom table. Use a 1990s wristwatch's stopwatch mechanism to start the clock when the ball rolls of the table. Stop the stopwatch when the ball hits the floor.
Our class had some kind of device that would either punch a hole or make a mark on paper at a regular time interval. We attached a narrow strip of paper to the ball and let it pull through the marking device as it fell from the bench to the floor. We then measured the distance between each mark, noting that the distance increased with each interval, and used this to calculate g. I don't recall anything more than that, or how I did on that lab. I received a 50 one marking period for not handing in labs, but otherwise had a 90+ average in the class.
In the UK we called it ticket tape and it was terrible. The devices barely worked and they cause a bunch of friction so you end up calculating a value of 'g' that's off by like 30%.
I think officially we called it ticker tape, as in stock ticker; it was originally used to record stock prices transmitted by telegraph.
That's an interesting way to measure the passage of time -- just use something that produces a "regular distance" and derive a way from kinematics to calculate the acceleration from the change in the distance.
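That second-difference idea is simple enough to sketch in code. Below is a rough Python illustration with entirely invented numbers (a 50 Hz timer and made-up mark spacings): under constant acceleration, the gap between successive marks grows by g·dt² each tick, so g falls out of the differences between gaps.

```python
# Estimating g from ticker-tape marks: a rough sketch with invented data.
# Assumes the timer makes a mark every dt seconds (50 Hz is a common
# mains-driven rate) and that `gaps` holds the measured distances, in
# metres, between successive marks on the tape.
dt = 0.02
gaps = [0.0150, 0.0189, 0.0228, 0.0267, 0.0306]

# Under constant acceleration each gap exceeds the previous one by g*dt^2,
# so the first differences of the gaps give the acceleration directly.
diffs = [b - a for a, b in zip(gaps, gaps[1:])]
g_est = sum(diffs) / len(diffs) / dt**2

print(f"estimated g = {g_est:.2f} m/s^2")  # ~9.75 with these numbers
```

A nice property of this method is that it needs no absolute time reference at all, only a constant tick rate, which is exactly what a mains-synchronised marker provides.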
The way boats historically measured speed was by dragging a rope behind them. The rope has knots tied with exact spacing. You drop one end of the rope in the water, and count how many knots pass you in a given time. That's then your speed in knots.
Well, the other problem is knowing where you are. The sun/stars can give you latitude. Longitude was nearly impossible until the advent of the marine chronometer in the latter part of the 18th century, and it was not "standard" on ships until the mid-1800s. There were earlier versions, which had poor accuracy and were not much better than dead reckoning.
The rope has a mechanism for creating drag (a wooden board) at the end, and regularly spaced knots. You throw the board in the water, let the rope play out through your hands, and count the knots as they pass through your hands while watching a timer.
Ticker tape timer. My class had the same thing for the same experiment.
... like something that burns a hole in the paper with a spark or marks thermal paper with a burst of heat.
That’s pretty bad. On top of being unfair, it was a total missed opportunity to talk about the law of large numbers (I wonder if they could get a decent sample by combining everybody’s measurements) or skew (maybe everybody is a couple milliseconds too low just based on reaction time).
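That bias-vs-noise distinction is easy to demonstrate with a small simulation. A sketch (every parameter below is invented for illustration): pooling many noisy stopwatch readings shrinks the random scatter, but a shared reaction-time delay survives averaging untouched.

```python
# Sketch: averaging many stopwatch readings beats random jitter but not a
# shared reaction-time bias. All parameters here are invented.
import random

random.seed(0)
TRUE_FALL = 0.40      # s, true fall time from a table-height drop
REACTION_BIAS = 0.15  # s, everyone stops the watch a bit late
NOISE_SD = 0.20       # s, per-measurement jitter

times = [TRUE_FALL + REACTION_BIAS + random.gauss(0, NOISE_SD)
         for _ in range(1000)]
mean_t = sum(times) / len(times)

# The random error shrinks roughly as 1/sqrt(N), but the mean converges
# to TRUE_FALL + REACTION_BIAS, not TRUE_FALL: more data, same skew.
print(f"mean timed fall: {mean_t:.3f} s (true: {TRUE_FALL:.2f} s)")
```

This is the law-of-large-numbers lesson in miniature: combining the whole class's data would tighten the error bars dramatically while leaving the systematic reaction-time offset fully intact.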
Or there could be some air resistance if you used, like, ping-pong balls.
Correct. Ask anyone who plays blitz/bullet chess online. Games are won and lost in the final second of gameplay.
>The lesson is taught early and often. It often sort of baffles me when other people are baffled at how often this happens in science,
Math and some sciences have the aura of definitive right and wrong, so even though by college everyone knows the expression "give the answer the teacher wants to hear", they just think in those subjects the teacher has access to absolute answers.
The primary thing taught by our schooling system (and 2nd place isn't even close) is obedience to bureaucracy. This has the obvious effects, but one of the subtler ones is deference to "science" as an authority requiring obedience, rather than as the process of figuring shit out.
I studied engineering rather than physics. In our lab reports we were expected to include a discussion of the results and the experimental method. It was basically expected that the report include commentary on potential sources of error and modifications to improve the experimental accuracy.
I don't recall ever being marked down for failing to obtain the "correct" result; the impression I came away with was that, so long as you were thorough in your discussion and analysis, the exact result was less important.
I can remember my second-year thermodynamics class had a fairly complicated lab, which involved taking measurements from the inflow and outflow of various heat exchangers in a variety of configurations (counter flow, cross flow, etc.), then computing the efficiency of each configuration. I recall getting into minutiae in the report about assumed friction factors and suggesting methods to assess the smoothness of the PVC pipes to improve the accuracy of the calculations.
I had a physics class in my high school. 2014? 2015? Around then.
The teacher had us using a stopwatch on our phones. We would repeat the experiment several times and average the result, because manually operating a stopwatch was terrible; multiple samples kinda helped.
My group figured out we could get things way more accurate if we videoed the experiment in slow-motion with a phone, keeping a digital stopwatch in frame. It took an extra step of math, subtracting out the start time, but in slow motion we could be accurate to 1/120th of a second. Our results were easily the most precise in the class. Equipment can make a huge difference, and slow motion video was considerably more accurate than “Mike trying to time it right”
I'm certainly not going to defend your teacher or your experience, especially at the high school level. That's too soon. And I also remember being indignant for a similar experience in analytical chemistry.
But... there's a point in one's development as a science student, where science becomes more nuanced than "doing your best and honestly reporting what you observe." Those things will always be there of course. But in an experimental science, doing an experiment and getting accurate results is a vital skill, or you'll never make progress.
Naturally you have no standard for checking a measurement whose result is truly unknown, but you can insert the equivalent of breakpoints where you make sure that the same data do reproduce known results. Ironically for the discussion here, those are called "gravity tests." Students need to know at some point if they're going to like the experimental side of science. Getting things right is part of it. Some people don't belong in the lab.
I happen to be stuck at the "gravity test" level in my day job. My experiment produced a calibration that's reproducible, and that I could use, but it doesn't make sense. I'm not going to move forward until it does.
The problem with a lot of teaching is that the purpose of the lesson is never explained, and the nuanced view is never spelled out.
In the first grade I knew exactly where on my fingers the width was an inch or a cm.
I got called up in front of class and punished for cheating on a length estimation assignment.
They told everyone I was a cheater that used a ruler :P
Besides contributing to the sob stories, my point is maybe some of those kids got lucky with a good measurement/timer. Sorry you had a really bad teacher.
I had a similar experience measuring gravity in high school. Our method was using a ticker timer.
One of these. https://www.physicsforums.com/threads/the-history-of-ticker-...
The inevitable happened: after years of classroom abuse, the timer provided enough friction that the falling object swung on the paper like a pendulum and slowly made its way to the ground over the course of about 5 seconds.
We analysed the meaningless dots on the paper and wrote up a calculation of gravity of 9.6 m/s^2, attributing the 0.2-ish discrepancy to "possible friction or accuracy of the timer".
This taught me more about science than I care to think about.
That's a poor way to measure g. In multiple schools I went to, the standard was to measure g via a pendulum (I think measuring the period).
I measured a 9.86[1] :-) Mostly dumb luck. But most people in the class would get decently close (9-10.5).
[1] The correct value is closer to 9.81.
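The pendulum method leans on the small-angle period formula T = 2π√(L/g), so g = 4π²L/T². A quick sketch with invented numbers (timing 20 swings in one stopwatch run, which spreads the button-press error thinly over many periods):

```python
# Pendulum sketch: for small swings T = 2*pi*sqrt(L/g), hence
# g = 4*pi^2*L/T^2. The length and timing below are invented examples.
import math

L = 1.00        # pendulum length, metres (assumed)
t_20 = 40.1     # stopwatch reading for 20 full swings, seconds (assumed)
T = t_20 / 20   # one period

g = 4 * math.pi**2 * L / T**2
print(f"g = {g:.2f} m/s^2")  # ~9.82 with these numbers
```

This is why the pendulum beats the falling-ball setup: a half-second jab error on a 40-second measurement is about 1%, while the same jab on a sub-second fall swamps the signal entirely.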
> no matter how many times you are solemnly assured otherwise, you are never graded on whether you did your best and honestly report what you observe. From grade school on, you are graded on whether or not the grading authority likes the results you got.
Wouldn't have helped me before late high school, but that "whether or not the grading authority likes the results you got" part cuts both ways. That is, if you put some extra effort into presentation, you can get at least some of the authorities to recognize your effort. Or, if you're really good, you can even bullshit wrong results past them, as long as you give a strong impression of competence.
Or at least that's what undergrad studies taught me; for random reasons I went into overkill on some assignments, and I quickly discovered this worked regardless of the validity of my results.
I guess a big part of it is that most other people a) don't really put in much effort, and b) don't see any importance of the work in larger context. So I found that if I showed (or faked) either, I was set; show both, even better.
(Though it didn't work 100% of the time. I distinctly remember spending a lot of time figuring out how to simulate lexical scope and lambdas with strings and eval in Lotus Notes. My professor was impressed, even suggesting I write the details up, but then she proceeded to fail me on the exercise anyway, because I didn't actually do half of the boring things I was supposed to.)
(It also taught me to recognize when someone else is deploying smokescreens of competence to pass off lazy or bad results.)
Well, on the flip side, I had a couple of classes in which we were supposed to "critique" papers, for the laudable purpose of learning critical thinking skills and how to evaluate papers.
We also were supposed to read the greatest papers in the field to learn about the field from the primary sources, also a laudable purpose.
Unfortunately, these two things were put together, and we were expected to produce "critiques" of the greatest papers in the field.
Now, I've told this story a couple of times, and always some anklebiter jumps up from the replies to point out that even the greatest papers can have mistakes or be improved or whatever. Which is in principle true. But when Einstein comes up to you and for the first time in world history explicates his new theory of relativity, you aren't doing him, yourself, or the world a favor by "critiquing" his choice of variable names, quibbling about his phrasing, or criticizing him for not immediately knowing how to explain it the way physicists will explain it after over 120 years of chewing on it.
In practice, there is no practical way to "critique" these papers. They are the ones that have slugged it out with hundreds of thousands of other papers to still be recommended to undergraduate students 20-40 years later. There is no reason to believe that a college junior, even one from decades down the line, is going to give any suggestions that can improve such papers.
So what I learned is that I can just deploy a formula: 1. Summarize the paper quickly, ideally with some tidbit that proves you really read it. 2. Use my decades of foresight to complain that the author didn't do in this paper something the field built on it later, quite possibly led by the same author (I dunno, I didn't check, of course; I'm just complaining). 3. Say "more research is needed"; it's a cliche for a reason. -> Get an A every single time, despite putting no real cognitive effort into the critique.
I did at least read the papers for real, and that was fine, but my "critique" was 100% presentation, 100% genuflection of the ritual words of science, knowingly shorn of meaning. Heck, even now I don't think I feel bad about that; I just delivered what was asked for, after raising the objection once. At least we read some of the literature, and that is a skill that has served me for real, in real life, even though I did not go into academia proper.
I used to teach 5th graders math, including a lesson about angles. I would let them draw a triangle and measure the angles with a protractor, then calculate the sum. The sum usually came out around 177 or 178 degrees.
The story of Isaac Asimov's "shotgun curve" is relevant:
https://archive.org/details/Fantasy_Science_Fiction_v056n06_...
For typical distances (say, the height of a table or a shelf) the time should be on the order of a fraction of a second. There's a couple hundred ms of delay in the human auditory and motor systems, which is a sizable fraction of the time you're trying to measure, and which one would have to try to account for (not all that easy, especially in a HS physics class).
In my university we had a more precise setup for that. It was some sort of weight on a rail at a known incline, and a digital timer with two sensors known distance apart that start and stop it.
Yet in my class we still had results as low as 7 and as high as 12. We all got passing grades. But the protocol for these lab assignments was always such that you had to have your "measurements sheet" signed by the professor, and you turned it in with your report later.
Similar here. What the teachers were actually looking at was whether the calculations and error analysis were done right.
Having recently gotten into quantum mechanics and listened to a lot of audiobooks on its history, that's one of the biggest takeaways for me. So many major advances in theory languished for years because of the politics of the day or the personal opinions of an advisor, only for a physicist with greater standing to rediscover the same thing later and finally get it some attention. (Hugh Everett and David Bohm being two examples.)
> Right on up to professorships, this is how science really works.
That's why I am making my exit from academia and research entirely as soon as I finish my PhD. The system is filled with wonderful, intelligent people but is, sadly, simultaneously rotten to the core. It did not, in fact, get better as I moved from undergrad to grad school.
I think if you had shown not only the point estimate but also some measure of uncertainty like the standard deviation, it should have earned you a passing grade. It's hard to argue that an answer like 6.8 ± 5 is wrong.
Even if you don't yet have formal statistical chops, it should at least be possible to show a cumulative distribution function of the results, which conveys the story better than a single answer with overly optimistic implied precision.
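To make that concrete, here's a minimal sketch of both ideas, using entirely made-up stopwatch readings for a table-height drop (the numbers, the 0.9 m height, and the spread are all hypothetical, chosen to mimic the kind of scatter hand-timing produces):

```python
import statistics

# Hypothetical stopwatch readings (seconds) for a ~0.9 m table drop.
# The true fall time is ~0.43 s; human reaction time adds huge scatter.
times = [0.31, 0.55, 0.62, 0.40, 0.71, 0.38, 0.50, 0.66, 0.45, 0.58]
height = 0.9  # metres, assumed table height

# One estimate of g per trial, from d = (1/2) g t^2  =>  g = 2d / t^2
gs = [2 * height / t**2 for t in times]

mean_g = statistics.mean(gs)
std_g = statistics.stdev(gs)
print(f"g = {mean_g:.1f} +/- {std_g:.1f} m/s^2")

# Empirical CDF: fraction of trials at or below each estimate.
cdf = [(g, (i + 1) / len(gs)) for i, g in enumerate(sorted(gs))]
for g, p in cdf:
    print(f"g <= {g:5.1f}: {p:.0%}")
```

Reporting "mean ± stdev" (and the CDF's long tails) makes it obvious the apparatus, not the student, is the problem, which is exactly the honest story the single number 6.8 hides.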
This is early high school. We didn't have error bars yet, we just took an average. I just used that as a convenient way to describe how erratic our numbers were. If 6.8 is the average you know we had some low numbers in there. And some nice high ones, too.
You're certainly correct that the true value would have been within our error bars. One of those good teachers I acknowledge the existence of in my long paragraph, sarcastic as it may be, could conceivably have had us run such a garbage experiment and shown that, as bad as it was, our error bars still contained the correct value for probably all but one student or so. There's some valuable truth in that result too. Cutting-edge science is often, in some sense, equivalently the result of bodging together a lot of results that in 30 years' hindsight will also be recognized as garbage methodology and experiments, not because cutting-edge researchers are bad people but because they were the ones pushing the frontier and building the very tools that later people would use for those precision experiments. I always try to remember that context when reading about early experiments decades later.
It would also have been interesting to combine all the data together and see what happened. There's a decent chance that would have been at least reasonably close to the real value despite all the garbage data, which again would have been an interesting and vivid lesson.
This is part of the reason this is something that stuck with me. There were so many better things to do than just fail someone for not lying about having gotten the "correct" result. I'm not emotional about anything done to me over 30 years ago, but I'm annoyed in the here and now that this is still endemic to the field and the educational process, and this is some small effort to help push that along to being fixed.
It's honestly kind of bullshit, because the bedrock of a lot of my work is being realistic. If I had such a piece-of-crap setup, I would have gladly reported the 6.8 meters per second squared and then turned around and identified all of the problems with my setup, right down to characterizing the lag time on the stopwatch start.
In fact, one of the trickiest problems I ever had to resolve was showing that the reason a piece of equipment couldn't accurately accumulate a volume from a very small flow was the fixed-point decimal place they had chosen. Part of how I did that was by optimizing a measurement device for the compliance of a fixed tube until I got really good, consistent results. Because I knew those numbers were actually really good, it came down to how we were doing the math in the computer, and I just had to analyze all of the accumulation and other math to determine the accumulated error. It turned out to be in really good agreement with what the device was doing.
All of that came from our initial recognition that the measured quantity was wrong for some reason.
When we did that in high school, we took long exposure photos with a strobe light and measured where the ball was at each strobe interval. I think it worked out well.
I'm sure nowadays the experiment would just be one slow-mo video on your phone.
Brings back memories!
In my case it was a slide on an air cushioned aluminum beam.
And the interesting part was that, for some reason, if we pulled it up towards the top, beyond some point it took less time to travel across the whole beam.
I put quite some effort into figuring out why, repeating it again and again, studying the beam for any irregularities, and brainstorming about why this happened.
My physics teacher really liked that at least some of his students had dug into it (I think we weren't the only group) and made it very clear in the feedback (he did not mention who had gotten it wrong, just that some had observed this and looked into it instead of covering it up or throwing away the data we didn't like).
Didn't exactly enjoy school, but people like him made it a lot better.
I had a completely different experience. As a physics major, I did the famous Millikan oil drop experiment. I am a terrible experimentalist (I went on to do my PhD in theoretical physics), so we got a charge of about 1/3 of the charge of an electron. Now, since I did not get a Nobel Prize, I evidently did not actually measure the charge of a single quark, but I still got good enough grades for this study.
When I did the mandatory lab exercises in physics, there was a more benign variant of that problem: the conventional value had to fall inside the error interval. However, you were allowed to add additional errors with a good explanation (...some creativity). I really didn't like inflating the estimated errors to make the result work, and I think the (unimportant) grades were reduced for doing it.
I remember being really consistent with the stopwatch in one exercise, so sadly the spread of measurements (implying a natural uncertainty) was small. That was bad!
In my high school, without naming any names, the teacher told us all that anyone who changed their results to 9.81m/s^2 was doing science incorrectly. And we were graded on our analysis of the experimental procedure, or something like that.
https://v.cx/2010/04/feynman-brazil-education
> Then I held up the elementary physics textbook they were using.
> There are no experimental results mentioned anywhere in this book, except in one place where there is a ball, rolling down an inclined plane, in which it says how far the ball got after one second, two seconds, three seconds, and so on.
> The numbers have ‘errors’ in them – that is, if you look at them, you think you’re looking at experimental results, because the numbers are a little above, or a little below, the theoretical values. The book even talks about having to correct the experimental errors – very fine.
> The trouble is, when you calculate the value of the acceleration constant from these values, you get the right answer.
> But a ball rolling down an inclined plane, if it is actually done, has an inertia to get it to turn, and will, if you do the experiment, produce five-sevenths of the right answer, because of the extra energy needed to go into the rotation of the ball.
> Therefore this single example of experimental ‘results’ is obtained from a fake experiment.
> Nobody had rolled such a ball, or they would never have gotten those results!
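Feynman's five-sevenths figure can be checked with a quick energy balance; here's a sketch, assuming a uniform solid sphere rolling without slipping:

```latex
% Uniform solid sphere: I = \tfrac{2}{5} m r^2; rolling without slipping: v = \omega r.
% Energy balance for a drop of height h along the incline:
mgh = \tfrac{1}{2} m v^2 + \tfrac{1}{2} I \omega^2
    = \tfrac{1}{2} m v^2 + \tfrac{1}{2}\cdot\tfrac{2}{5} m r^2 \cdot \frac{v^2}{r^2}
    = \tfrac{7}{10} m v^2
% Hence v^2 = \tfrac{10}{7} g h, and the acceleration down the incline is
a = \tfrac{5}{7}\, g \sin\theta
```

So a real rolling-ball experiment, analyzed naively as pure sliding, reports roughly 5/7 of the true g, which is exactly Feynman's complaint about the faked textbook numbers.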
Reading your post, I now realize education is dysfunctional in the entire world, not just in my country. Small comfort.
Interesting. If that is correct and you take OP's value, 6.8 / 5 * 7 = 9.5, which is pretty damn close. So his failing grade was for the only non-cheated result?
I had a similar experience in Physics 101 and Chemistry 101. The labs were chaotic and had limited time. If you were even a little bit unlucky it would be impossible to even finish them let alone get remotely decent results.
I'm convinced 60% of the class faked results or copied many results from previous year's students.
This is how I remember my own undergrad physics and chemistry labs: Terrible equipment and no time. The students who turned in faked but plausible data that looked like what the professor expected to see would get A's and the students who actually did the experiments and reported the crap they measured got lower grades. Everyone just learned the wrong lesson: Figure out what the data should look like and fake it.
I got a D in a high school biology genetics lab working with fruit flies because our chi-squared p-value was a little less than the common significance value of 0.05.
Our results were close enough that we could still easily determine the phenotype and genotype of the parent and grandparent Fruit Flies (red/black eyes), but it was kind of a bummer to be punished in a highly error prone experiment (flies dying from too much ether, flies flying away, flies getting stuck in food and dying, etc).
It did teach me to be more careful when running experiments but I probably would have given myself a C, not a D
Did the same. Least squares got me to 9.7
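That least-squares fit is simple enough to do without any libraries; here's a minimal sketch with made-up drop data (all heights and times below are hypothetical, not the commenter's actual measurements):

```python
# Hypothetical drop data: heights (m) and noisy stopwatch times (s).
# For the model d = (1/2) g t^2 through the origin, minimizing squared
# error over g gives the closed form: g = 2 * sum(d * t^2) / sum(t^4).
heights = [0.5, 0.8, 1.0, 1.2, 1.5]
times = [0.33, 0.41, 0.44, 0.51, 0.57]

num = sum(d * t**2 for d, t in zip(heights, times))
den = sum(t**4 for t in times)
g = 2 * num / den
print(f"g ~ {g:.2f} m/s^2")
```

Fitting all the points at once averages out the timing noise far better than computing g from any single trial.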
> The lesson is taught early and often. It often sort of baffles me when other people are baffled at how often this happens in science, because it more-or-less always happens. Science proceeds despite this, not because of it.
I think we should definitely not learn from this that science still works despite those things. Because then it's easy to just say it is what it is. I think it's much more helpful to be critical of the scientific process (scientific policies in particular) and see how it can be improved. As I said many times before here on Hacker News, basically nothing in science has changed since papers like Why Most Published Research Findings Are False by Ioannidis have come out. I think we as civilians should demand more from science than a bunch of false papers behind paywalls.
On a side note, one thing every single one of my peers who pursued a creative degree has echoed, be it architecture, literature, graphic design, industrial design, etc., is that the only way to get a good grade is to find out what your professor's personal preferences and opinions are and be in total and utter agreement with them.
Any amount of critical views tends to result in your work torn to pieces and you getting a shitty grade.
Your architecture professor likes turrets? Then better put them even on the chicken coop; that way he'll know you're one of the students who gets it.
Your lit professor loves a certain philosopher? - better not point out that you find his arguments circular, ponderous and betraying a lack of broad perspective.
This has been utterly weird to me, considering I have encountered way less (but not zero) of this in engineering, and art is supposed to be about developing your self-expression. Yet I've heard this criticism so many times, from so many places, and formulated so strongly. I've known many people who flat-out left their educations because of this, while others just quietly powered through.
This in and of itself has changed my view of art education, and I've told many people to stay away from these places, not because of the usual "it's useless and you'll starve to death" arguments but because of this.
At my high school, somehow physics was the dumb jock science course. I think it was because the head football coach taught physics for decades before retiring my sophomore year. Anyway, as a kid who was doing well in school and was headed for college, it was a natural decision for me to not bother taking physics and study for the AP test on my own. But one day a kid showed up in one of my classes with a hall pass for me to go to the physics classroom. The new teacher needed my help.
She had planned on teaching a lab on gravity and acceleration that day, but she was having trouble getting the right experimental results. Now, this story is not going to reflect well on her, so I want to say up front that she was already taking physics education at my high school to unprecedented heights by 1) trying out the lab on her own before trying to teach it, and 2) actually giving a shit about the results. I doubt the coach who had previously taught physics ever bothered to do any of the experiments himself, and I'm guessing everyone who ever turned in a lab report to him got an A regardless of the contents.
So there I am, a future physics major walking into a physics classroom for the first time in my academic career. I'm nervous because I have a reputation as a smart kid, and specifically as a smart science and math kid, but I was better with math and theory than with machines and measurements. I'm excited about getting to look smart in front of the other kids, but I'm also sweating bullets that there might be something about the equipment that I might not be able to figure out. So I ask her to show me what the experiment is and how she's doing it.
The experimental setup is a small but heavy piece of metal attached to a long, thin strip of the kind of paper used for carbon copies. (Or carbonless copies maybe. You know the paper where you write on one sheet, and there's a pressure-sensitive sheet underneath that creates a copy? It was a long strip of that pressure-sensitive paper.) The final piece of the experimental setup was a loud clacking thing that the strip of paper fed through. When it was turned on, a little hammer inside it slammed down every 1/4 of a second. The idea was, as the paper traveled through, the hammer left a mark every 1/4 of a second, and you could measure how far the paper traveled in each interval between the hammer strikes. Much more precise than a stopwatch!
You have already figured out how the experiment works. You hold the clacker at a fixed height against the wall or some other high fixed point, thread the weight end of the paper through it, turn the clacker on, drop the weight, and the clacker leaves marks on the paper that let you calculate g.
The teacher understood this, to an extent. But she decided that it would be less of a logistical hassle if the students did the experiment at their lab tables, by holding the clacker on the table and pulling the weight horizontally across the table with their hand. She tried this quite a few times herself, plotted the numbers, and could not get the plot to look like a parabola like in the textbook. I explained to her, "We're measuring gravity, so gravity has to do the work. If we move it with our hands, we're just measuring our hands. If gravity moves it, we'll measure gravity." We tried it, it worked, and she sent me back to whatever class I had been in when she sent for me.
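Once gravity does the work, extracting g from the tape is a nice finite-differences exercise; here's a sketch using invented tape measurements (the interval distances below are illustrative numbers consistent with g = 9.8, not real data):

```python
# Hypothetical tape marks from a clacker ticking every 1/4 second,
# weight dropped from rest. Position follows x = (1/2) g t^2, so the
# distance covered in each successive interval grows by a constant
# amount: the *second difference* of position equals g * dt^2.
dt = 0.25  # seconds between hammer strikes

# Distance travelled (cm) in each quarter-second interval:
intervals_cm = [30.6, 91.9, 153.1, 214.4, 275.6]

# Each interval is g*dt^2 longer than the last; average those jumps.
jumps = [b - a for a, b in zip(intervals_cm, intervals_cm[1:])]
g = (sum(jumps) / len(jumps)) / 100 / dt**2  # cm -> m, then / dt^2
print(f"g ~ {g:.2f} m/s^2")
```

Using second differences means you never need to know exactly when the drop started, only that the hammer interval is constant, which is what makes this setup so much better than a stopwatch.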
Now I feel lucky to have gone to a school where the teachers universally understood the material they were teaching. The only poor teaching I encountered was in the pedagogy itself, and only from a minority of teachers.
So did you let this go without protest? Why not escalate it if it was clearly so unreasonable?
Sounds like there was more nuance to the story.
Because my policy in childhood was to bend like the willow and not break like the oak. Not phrased in those words, and not quite as consciously chosen as it is now, but it was my policy, and for the most part I stand by it. Modern me, looking back with an engineer's rather cold cost/benefits analysis, sees way more cost than any possible benefit, so I might refine my past self's reasons but I'd still take the same actions.
Fortunately, this was closer to a one-off problem in an otherwise acceptable class rather than a systematic issue.
He was just a kid, man.
"Escalating" in American high school is a good way to increase your consequences to no benefit.
I escalated a very similar thing with a college professor--in a social sciences class.
She did not update my score, she argued a while in front of class, and when she lost the argument, said I could take it up with her supervisor.
I declined (it was one question on a larger test)