And a blind man felt the elephant and declared, "It's a computer screen."
https://en.wikipedia.org/wiki/Blind_men_and_an_elephant
Origamists create an elephant with nothing more than one fold! See some minimalist elephant designs by David Mitchell and Paul Jackson:
http://www.origamiheaven.com/pdfs/elephantsextreme.pdf
That was both satisfying - I wasn't promised more than what I got - and disappointing - I totally expected something fancier from "Origamists", something I couldn't do myself.
Almost anyone can "do" origami given instructions. But no one came up with a single-fold design until David Mitchell did -- minimalist designs are only easy in hindsight. At the other extreme we have Robert Lang's amazing creations (which may require many more than four complex parameters!).
http://www.langorigami.com/artworks/mammals
This is beautiful! Any hints where to find more like this?
http://www.origamiheaven.com/ & http://www.origami-instructions.com/ have many examples with instructions for single sheet designs.
Visualization of the same technique:
https://www.reddit.com/r/Art/comments/7wztif/generative_art_...
The medieval astronomers had it right: epicycles work! They didn't reflect the actual mechanism behind planetary motion, though; they merely predicted it. "Circular regression", essentially. The ancient Greeks used threaded pegs to make puppet shows and the like: the first programmable robots. I'd like to see emulations of those.
Homer Simpson in 1000 epicycles: https://www.youtube.com/watch?v=QVuU2YCwHjw
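Epicycle drawing like that video is a discrete Fourier transform in disguise: each coefficient is one circle, with a radius (magnitude), rotation speed (frequency), and starting angle (phase). A minimal pure-Python sketch of the idea (function names are my own, not from the linked demo):

```python
import cmath
import math

def dft_coeffs(points, n_terms):
    """Represent a closed curve (complex samples) as a sum of
    rotating circles (epicycles) via a naive DFT."""
    N = len(points)
    # lowest frequencies first: 0, -1, 1, -2, 2, ...
    freqs = sorted(range(-(n_terms // 2), n_terms - n_terms // 2), key=abs)
    return {k: sum(p * cmath.exp(-2j * math.pi * k * i / N)
                   for i, p in enumerate(points)) / N
            for k in freqs}

def reconstruct(coeffs, t):
    """Evaluate the epicycle sum at curve parameter t in [0, 1)."""
    return sum(c * cmath.exp(2j * math.pi * k * t) for k, c in coeffs.items())

# a unit circle needs exactly one epicycle: coefficient 1 at frequency 1
circle = [cmath.exp(2j * math.pi * i / 64) for i in range(64)]
coeffs = dft_coeffs(circle, 3)
```

With enough terms the same two functions will trace any sampled closed outline, Homer included; the video is just this with 1000 coefficients.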
Slightly cheating by using complex numbers, so there are 8 degrees of freedom instead of 4. But bravo anyway, it was a great exercise.
I'm not sure counting the number of real numbers that you have to input is quite the right way to think about this.
I could encode instructions to draw an arbitrary shape in a single real number if I wanted: .00011110 could be interpreted as a square if I take pairs of digits to be successive (x,y) coordinates (my example becomes (0,0), (0,1), (1,1), (1,0)).
Degrees of freedom expresses a bound on the complexity of things you can represent.
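The digit-pair encoding described above, as a literal sketch (the number is taken as a string to dodge float rounding; the function name is my own):

```python
def decode_digit_pairs(r: str):
    """Read the fractional digits of r two at a time as (x, y) vertices."""
    digits = r.split(".")[1]
    return [(int(digits[i]), int(digits[i + 1]))
            for i in range(0, len(digits) - 1, 2)]

decode_digit_pairs(".00011110")  # -> [(0, 0), (0, 1), (1, 1), (1, 0)]
```

With more digits per coordinate, or any interleaving scheme, one real number carries arbitrarily many vertices -- which is exactly why a raw count of real-valued inputs can mislead.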
The motivation of this work was a conversation about the over-complexity of a scientific model. You literally count the free parameters, and if there are loads of them you need zillions of observations to pin them down, due to the curse of dimensionality.
It's the same concept in parametric statistics:
https://en.m.wikipedia.org/wiki/Degrees_of_freedom_(statisti...
Non-parametric statistics works by fitting models with infinitely many degrees of freedom. But then you need fancy math to figure out how complex your model currently is and how well your data supports it.
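One concrete version of that "fancy math": for linear smoothers the fitted values are yhat = S y, and trace(S) serves as the effective degrees of freedom. A sketch with a Gaussian-kernel (Nadaraya-Watson) smoother -- the bandwidth is the knob that moves the same model between memorizing the data (dof near n) and a global mean (dof near 1); names here are mine, not from any particular library:

```python
import math

def smoother_matrix(xs, bandwidth):
    """Gaussian-kernel smoother as an explicit matrix S with yhat = S @ y."""
    S = []
    for xi in xs:
        w = [math.exp(-0.5 * ((xi - xj) / bandwidth) ** 2) for xj in xs]
        total = sum(w)
        S.append([wj / total for wj in w])
    return S

def effective_dof(S):
    """trace(S): the smoother's effective degrees of freedom."""
    return sum(S[i][i] for i in range(len(S)))

xs = [i / 10 for i in range(10)]
dof_narrow = effective_dof(smoother_matrix(xs, 0.001))  # ~n: interpolates every point
dof_wide = effective_dof(smoother_matrix(xs, 100.0))    # ~1: essentially a global mean
```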
One might also argue that choosing a non-contiguous sequence of Fourier coefficients is not perfectly honest, since it allows for encoding information in the coefficient indices.
This parametric equation set draws a hamburger: x=sin(tan(t)) and y=cos(t)
Picture: http://uncyclopedia.wikia.com/wiki/File:Hamburger_plot_ies.P...
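For anyone who wants to check without following the link, the curve is easy to sample; the only care needed is that tan(t) blows up at odd multiples of pi/2, so the sampling has to dodge those points (plotting itself left out; this is just my own quick sketch):

```python
import math

def hamburger(t):
    """The parametric curve x = sin(tan(t)), y = cos(t)."""
    return math.sin(math.tan(t)), math.cos(t)

# sample strictly between the singularities of tan at -pi/2 and +pi/2
pts = [hamburger(k * math.pi / 100) for k in range(-49, 50)]
```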
And the professional digital modeler who just found a way to cut 20 triangles out of the trunk wireframe just rolls their eyes and goes back to fine-tuning the skeleton so that the active forelimb doesn't pinch in the "skull-crush execution" animation...
The described technique is interesting, but elephants are 3-dimensional objects with a somewhat more detailed contour, so I'm going to have to declare that the well-known saying remains unimplemented. I think Dyson could have retorted, "Yes, but my model will be finished long before you have discovered your fourth parameter."
> Yes, but my model will be finished long before you have discovered your fourth parameter
This attitude is one of the issues I encounter among data scientists, and it limits their impact in business to lower-ticket decisions (like individual recommendation systems) rather than bigger budget-allocation decisions; the same is the case in physics. Being able to explain drivers, and to protect against catastrophic overfitting failures, is much more important than getting a great predictive fit on a limited dataset.
Well, if they can use complex numbers, I think quaternions[0] are fair game for a 3D elephant.
[0]https://en.wikipedia.org/wiki/Quaternion
That seems like cheating. If you can use quaternions to squeeze out more degrees of freedom, you could also use octonions. You could use any number of additional imaginary units that square to -1.
We lose features too, as we climb in complexity. Lucky 10000: https://en.wikipedia.org/wiki/Cayley–Dickson_construction
But if you want a normed division algebra, you must stop at 8 dimensions.
I thought those things weren't so easy to come by -- that was why Hamilton was excited enough to carve it into a bridge, wasn't it?
I think the difficulty is actually in arranging them such that they combine to make an interesting algebra.
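Right -- the hard part is the multiplication table, not the extra square roots of -1. A quick Hamilton-product sketch (my own minimal tuple version) showing that i^2 = j^2 = k^2 = -1 while ij = k but ji = -k, i.e. the algebra only becomes interesting once the units interact non-commutatively:

```python
def qmul(a, b):
    """Hamilton product of quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
            w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
            w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
            w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2)

i, j, k = (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)
```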
A mechanical harmonic (Fourier) analysis device: https://www.youtube.com/watch?v=NAsM30MAHLg
Bonus points for the first to reimplement this in PyTorch or Tensorflow.
Here you go:
https://github.com/983/Elephant/tree/master
Congratulations, full bonus points awarded!
Seriously, that's very cool. Well done.
Thanks :)
Awesome! You are amazingly fast!
I've tried to add neural time warping to it, but it hasn't helped yet. L1 loss on the parameters did help reduce the parameter count on the fancy elephant task. Here are a couple of failed runs:
https://users.renyi.hu/~daniel/elephants/
Heh, I find these more interesting than the successful ones.
See also: Mechanical Laser Show [1]
[1] https://youtu.be/_dtBUiaAqRE
I'm currently taking Andrew Ng's Machine Learning course on Coursera; I immediately thought of using logistic regression to find this shape algorithmically. I know the premise of the paper is "with four parameters." But: with enough polynomial terms it should be easy to get much closer to the example in figure a than the one shown in figure b, no?
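For what it's worth, the course's polynomial-feature recipe does work on a 2-D silhouette. Here is a from-scratch sketch (no library; a toy circle stands in for the elephant outline, and all names are my own) of logistic regression on monomial features, trained by plain batch gradient descent:

```python
import math

def poly_features(x, y, degree=3):
    """All monomials x^i * y^j with i + j <= degree (constant term included)."""
    return [x ** i * y ** j for i in range(degree + 1) for j in range(degree + 1 - i)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(points, labels, degree=3, lr=0.5, steps=2000):
    """Batch gradient descent on the logistic loss."""
    feats = [poly_features(x, y, degree) for x, y in points]
    w = [0.0] * len(feats[0])
    for _ in range(steps):
        grad = [0.0] * len(w)
        for f, label in zip(feats, labels):
            err = sigmoid(sum(wi * fi for wi, fi in zip(w, f))) - label
            for i, fi in enumerate(f):
                grad[i] += err * fi
        w = [wi - lr * gi / len(feats) for wi, gi in zip(w, grad)]
    return w

def predict(w, x, y, degree=3):
    return sigmoid(sum(wi * fi for wi, fi in zip(w, poly_features(x, y, degree)))) > 0.5

# toy "silhouette": grid points inside a circle of radius 0.6 are the positive class
grid = [(i / 5, j / 5) for i in range(-5, 6) for j in range(-5, 6)]
labels = [1 if x * x + y * y < 0.36 else 0 for x, y in grid]
w = train(grid, labels)
```

The x^2 and y^2 terms are what let a "linear" classifier bend around a circle; an elephant outline would simply need a higher degree (and, per the paper's point, many more than four parameters).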
This is actually a standard first example in Fourier analysis, and has been for more than a hundred years.
Are all the parameters not described equal to zero?