"one hour coding interviews" are absolutely horseshit.
and not for the reason you think: you wanna give me a "code interview", make it 8 hours.
A full work day.
Sit with me for 8 hours -- and 8 hours I'm *sure* you're getting paid for -- and get the straight dope.
This whole "ascertaining a decade of skill in an hour or less" is just an insult at this point.
@somarasu so I don't know enough to tell if things are ever/always done this way, but I was kind of floored to watch a comedic video where someone "cheated" a remote coding interview by looking up references so the interviewers didn't notice? I was like, I'm pretty sure devs use references in their actual work (though maybe they're Real Men about it unlike my weenie self and use man instead of Stack Overflow), what the hell is the use of a closed-book coding test?? If anything they should hire that guy precisely because he has enough sense to ignore bullshit rules and look things up!
@ljwrites This is what enrages me: I guarantee the video you reference has a comment section CHOCK FULL of cis white men who know jack shit about anything, yet still have the tiny marbles to laugh at someone -- doing it comedically or not -- for utilizing the SAME resources they use to bitch about women not fucking them.
You have a resource you use to call people 'cucks' for not treating women like shit for No Reason ™️
Imma use it to get a job.
Same as same.
Only it *really* isnt.
programming rant, computational complexity
@ljwrites @somarasu and also all these Facezongoogappflix interviewers usually don't seem to understand the terms they are using... like, big O notation is also called asymptotic notation for a reason, and the difference between O(log(N)) and O(1) only starts to matter for such enormous values of N that your whiteboard non-distributed code will not be able to realistically handle them anyway.
Like, in this video they're using arrays, and Java arrays are indexed with 32-bit ints (from what I've found on Google), so you already know that the binary logarithm of N is no larger than 32. That means a state-of-the-art O(N) algorithm which takes 33 * N cycles is always worse, even purely in terms of performance, than a really simple O(N * log(N)) algorithm which takes log(N) * N ≤ 32 * N cycles.
But no interviewer ever pesters you with questions about what constant stands in front of that O; all they're worried about is how many logarithms of logarithms are inside that O, even when all those logarithms are bounded by a known small constant. They never say "oh, your solution is 1000*N, make it 500*N", but they will absolutely lose it if you make it 10*log(log(log(N)))*N instead.
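The arithmetic in the two posts above can be checked directly. This is a toy sketch (not from the original thread): `cost_fancy` and `cost_simple` are hypothetical operation-count models for a "state-of-the-art" O(N) algorithm with constant 33 and a naive O(N log N) algorithm with constant 1, and the loop shows the naive one wins for every N a 32-bit array index can reach.

```python
import math

def cost_fancy(n):
    # Hypothetical "state-of-the-art" O(N) algorithm with a
    # hidden constant of 33 cycles per element.
    return 33 * n

def cost_simple(n):
    # Hypothetical "really simple" O(N log N) algorithm with
    # constant 1; log2(n) is capped at 32 by the index width.
    return n * max(1, math.ceil(math.log2(n)))

# For any N addressable by a 32-bit index, log2(N) <= 32 < 33,
# so the "worse" asymptotic complexity costs fewer cycles.
for n in (2**10, 2**20, 2**31 - 1):
    print(f"N={n}: fancy={cost_fancy(n)}, simple={cost_simple(n)}, "
          f"simple wins: {cost_simple(n) < cost_fancy(n)}")
```

The point being: once N is bounded by the machine, the "asymptotic" part of asymptotic notation has already stopped doing any work, and only the constants matter.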
@somarasu ha! Probably. Wish I could find it again because it was a pretty funny depiction of a character in an arguably absurd situation. Instead I found a completely different video about a coding interview and I found it somehow relatable even though the technical details went whoosh over my head. You'll get much more out of it lol. https://youtu.be/kVgy1GSDHG8