
connor @notwa@cybre.space

Pinned ping

hi, i'm connor. :witches_town:

i'm interested in a lot of things, including learning and playing with math. i enjoy writing, among other hobbies.

sometimes i hack games, but more generally i follow wherever my interests take me. i tend to avoid big projects because of that. i ramble a lot about whatever i'm working on or learning about.

i've helped my sister raise a dozen pets! i also love a bunch of other stuff.

that grammar is awful but you get the idea

maybe i should get a big poster to put there

i keep thinking there's a window to the right of my computer chair like at the old place but there isn't, aaaa

maybe shorthand is not the right word, it's more like a max of 3 words per title

adding shorthand titles to all the nondescript paper filenames I've accumulated

I keep remembering places from my dreams and it's a strange feeling knowing they don't exist in any physical manifestation

there's a lot of stuff I like and agree with but I don't really talk about stuff I'm not actively into so it probably just looks weird from the outside

sorry i seem to post in my own bubble so often

I guess a cool takeaway from this is that you can squeeze enough information from MNIST-like datasets into a 28-dimensional vector by a linear transformation and get reasonable accuracies. I'll definitely come back to this...
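
something like this is the gist (sklearn's tiny digits set standing in for fashion-mnist, and a plain logistic regression standing in for my actual setup, which is the PCA/polynomial/linear-regression thing further down):

from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# tiny 8x8 digits instead of 28x28 images, same idea
X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# one linear map down to 28 dimensions, then a linear classifier on top
clf = make_pipeline(PCA(n_components=28), LogisticRegression(max_iter=1000))
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))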

i wanna photoshop 草 over the 爆 on the Bomberman 64 logo but i can't be bothered

i can't remember what my baseline is though so maybe this is awful lol

90.0% accuracy on fashion-mnist 👍 not bad for no convolutions

fairs? fares? crap, i can never remember

[just woke up] yay now it transfers flawlessly. i wonder how this fairs with regression problems? since nothing here is classification-specific

connor relayed

Well, this is a pleasant surprise. Distance is finally closing in on the end of the tunnel after 6 long years. refract.com/post/175365965296/

(Shoutouts to @snowstate who is the only member of the Distance dev team on the fediverse, and who has undoubtedly been cranking out some good stuff behind the scenes. o/)

oh i see, i'm missing the initial subtraction of the mean
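
for context, the fix is basically this (placeholder data, just a sketch of sklearn's PCA behaviour):

import numpy as np
from sklearn.decomposition import PCA

X = np.random.rand(100, 784)        # placeholder data
pca = PCA(n_components=28).fit(X)

# pca.transform subtracts the training mean before projecting onto components_,
# so a bare matrix multiply against components_ skips that step
wrong = X @ pca.components_.T
right = (X - pca.mean_) @ pca.components_.T
assert np.allclose(right, pca.transform(X))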

been playing around with simpler techniques today: PCA->PolynomialFeatures->LinearRegression. since the first and last are essentially just matrix multiplies, I'm trying out training them further by backprop. it seems to work well, but the transition from sklearn to onn (my toy neural network library) doesn't seem to be quite right and I'm not sure why yet.
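
concretely, the sklearn half looks something like this (made-up shapes; the onn side isn't shown since that's the part that's still off):

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# stand-in data; imagine flattened 28x28 images and 10 regression targets
X = np.random.rand(1000, 784)
y = np.random.rand(1000, 10)

pipe = make_pipeline(
    PCA(n_components=28),
    PolynomialFeatures(degree=2),
    LinearRegression(),
)
pipe.fit(X, y)

# both ends are affine maps, so their fitted weights can seed dense layers
# (this is the part that gets ported over and nudged further by backprop)
pca = pipe.named_steps['pca']
lin = pipe.named_steps['linearregression']
W_in, b_in = pca.components_.T, -pca.mean_ @ pca.components_.T
W_out, b_out = lin.coef_.T, lin.intercept_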

METZ is like an audio stressball

weird. computing the taylor series of sympy.functions.tanh seems to get exponentially slower by the 10th order. if i write it out with exponentials instead, the speed is totally fine!
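
for reference, the comparison is basically this (a sketch; exact orders and timings from memory):

import sympy as sp

x = sp.symbols('x')

# the built-in tanh: this is the call that crawls for me around the 10th order
slow = sp.tanh(x).series(x, 0, 10)

# the same function written out with exponentials expands quickly
tanh_exp = (sp.exp(2 * x) - 1) / (sp.exp(2 * x) + 1)
fast = tanh_exp.series(x, 0, 10)

# sanity check: the two expansions agree
assert sp.simplify(slow.removeO() - fast.removeO()) == 0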