DrBazza 10 days ago

When I was at university, we did a different Jupiter simulation - the whole planet. We were fortunate enough to have a comet smack into it and ring it like a bell.

Then a few of my senior colleagues used the observations in asteroseismology models (a generalised helioseismology model really) to study the interior.

https://en.wikipedia.org/wiki/Asteroseismology

https://en.wikipedia.org/wiki/Comet_Shoemaker%E2%80%93Levy_9...

  • Vicinity9635 9 days ago

    Jupiter is basically a big broom for Sol system. It's quite a nice GSV, don't you think?

cmehdy 10 days ago

This article was a joy to read, both for the explanations and the visuals. I'm not knowledgeable at all in visual generation, but I'm now wondering about other uses that could extend the method.

What other shapes could be coupled (with the technique used to create those various storms) to create large-scale transitions, where for example a large vortex would follow a sigmoid across the other zones?

Or even, in what subtle ways could the visuals follow the envelope of a Hans-Zimmer-esque audio background?

Thanks for sharing this blog!

e_dziewanowski 10 days ago

Hello everyone! I'm the author of the article. First of all, thank you so much for sharing it here. I've been taking note of the feedback - I'll try to fix the issue with contrast and other UX problems. If there are any specific suggestions or further feedback you have, please feel free to reach out to me. Thanks again for taking the time to read and share the article!

  • semi-extrinsic 10 days ago

    Fluid mechanics guy here. Let me first say this looks really nice overall!

    The part with probably the highest potential for improvement is the sharpening; the artifacts there still look a bit weird.

    Physically speaking, what you see on Jupiter (and on a river) is an interfacial flow. There is a divergence-free bulk flow underneath, but the interfacial flow itself has a lot of divergence. Upwellings have positive divergence and supply fresh stuff (colour!), downdrafts have negative divergence and consume stuff/colour.

    But wait! You are using curl noise for your vector field! Of course the divergence is then zero everywhere!

    If you take just the gradient of the scalar noise field you use for your curl noise, this will have lots of divergence and "compatible shape". Just scale this down a bit and mix with your curl noise.

    And then finally take the value of your scalar noise field, scale it to be symmetric around zero, and use this to determine how much color to add/remove.

    I think this will remove your need for sharpening entirely.

    Disclaimer: this is just top-of-my-head while walking home.
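The divergence argument above can be checked numerically. The sketch below is a toy Python illustration (not the author's shader): in 2D, "curl noise" built from a scalar field psi is the rotated gradient (d psi/dy, -d psi/dx), which is divergence-free, while the plain gradient is not, so blending a little gradient back in reintroduces the upwellings and downdrafts the comment describes. The noise function and the mixing weight here are made up for the demo.

```python
import numpy as np

def scalar_noise(x, y):
    """Smooth stand-in for a scalar noise field psi(x, y)."""
    return np.sin(1.7 * x) * np.cos(2.3 * y) + 0.5 * np.sin(3.1 * x + 1.9 * y)

def flow_field(x, y, h=1e-3, gradient_mix=0.2):
    """Curl noise plus a scaled gradient term (the divergent part)."""
    dpsi_dx = (scalar_noise(x + h, y) - scalar_noise(x - h, y)) / (2 * h)
    dpsi_dy = (scalar_noise(x, y + h) - scalar_noise(x, y - h)) / (2 * h)
    curl = np.array([dpsi_dy, -dpsi_dx])   # divergence-free part
    grad = np.array([dpsi_dx, dpsi_dy])    # divergent part (upwellings/downdrafts)
    return curl + gradient_mix * grad

def divergence(field, x, y, h=1e-3):
    """Central-difference divergence of a 2D vector field at (x, y)."""
    dvx_dx = (field(x + h, y)[0] - field(x - h, y)[0]) / (2 * h)
    dvy_dy = (field(x, y + h)[1] - field(x, y - h)[1]) / (2 * h)
    return dvx_dx + dvy_dy

pure_curl = lambda x, y: flow_field(x, y, gradient_mix=0.0)
print(abs(divergence(pure_curl, 0.4, 0.7)))   # ~0: curl noise alone is divergence-free
print(abs(divergence(flow_field, 0.4, 0.7)))  # clearly nonzero once the gradient is mixed in
```

The scalar value of psi itself (rescaled to be symmetric around zero) would then drive how much color to add or remove at each point, as suggested above.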

    • e_dziewanowski 10 days ago

      Really great observations - thank you! I already use the method you described - the curl is mixed with some amount of gradient to artificially bring color up from the bottom layers. It can be observed at the center of the red cyclone in the last YT clip. Keep in mind - I wasn't going for true fluid mechanics - I just used some of the flow patterns observed in real fluids and layered them on top of each other to give the illusion of more complex behavior. As for the sharpening - it is used to counteract the blurring effect of interpolating the color texture every frame.

  • smcameron 10 days ago

    Nice work. You briefly mentioned curl noise... About 10 years ago I wrote gaseous-giganticus[1], which uses curl noise to create gas-giant planet textures. They don't move like yours do, but they don't look too bad (and looking at Jupiter, you can't really see it move over small time scales anyway). Some animation is possible[2] with gaseous-giganticus, but not in real time, as it's all done on the CPU, and it doesn't really sustain over time: it starts off looking very fuzzy, resolves into something pretty nice, then gets weird. Here is some more output from the early days: https://imgur.com/a/9LipP

    Here are some slides about the development of gaseous-giganticus (best viewed with a real computer, not on a phone, as it uses arrow keys to navigate the slides): http://smcameron.github.io/space-nerds-in-space/gaseous-giga...

    [1] https://github.com/smcameron/gaseous-giganticus [2] https://imgur.com/mqCwMeI

noSyncCloud 10 days ago

Props for a site of that visual complexity that was performant, visually appealing, and eminently readable on mobile.

  • n4r9 10 days ago

    > performant

    Huh. Opening this webpage on Firefox floored my laptop (8 core 16GB). The lag was several seconds, including for clicking "back" or opening a new tab.

    • n4r9 10 days ago

      Follow-up: this only seems to be the case when the "Animated Great Red Spot" image is in view.

      • e_dziewanowski 10 days ago

        May I ask what GPU you have?

        • n4r9 10 days ago

          512MB ATI AMD Radeon Graphics

          • e_dziewanowski 10 days ago

            I'm afraid the only thing I can do in such a case is to display a static image instead of a shader. Would you prefer that?

            • n4r9 9 days ago

              I was able to view it smoothly on my phone, so I'm not too fussed, but that might be a better experience for anyone else that has the same issue in future.

    • bendhoefs 9 days ago

      It opened instantly and worked smoothly in Firefox on my 8-year-old Android.

      • n4r9 9 days ago

        Yes, on my phone it's fine but on my laptop it's a nightmare.

  • throwaway290 10 days ago

    The use of weird non-native scrolling really hurts navigation, and full justification looks clumsy when the screen is narrow. But otherwise it's not terrible.

  • enriquto 10 days ago

    The article is incredibly interesting, but the choice of colors is so low-contrast that I can only read it in "reader mode", where the animations don't work. I have resorted to "select all", where the letters stand out a bit, but it's ugly and not very ergonomic...

    • e_dziewanowski 10 days ago

      If the consensus is that the mobile color scheme is better than the desktop one I can just change it

    • amarant 10 days ago

      It's white on black, or at least white on very dark gray. Contrast is about as high as it could be on my device.

      Might there be a problem with your device?

      • ReleaseCandidat 10 days ago

        > It's white on black, or at least white on very dark gray.

        It's light grey (#666b67) on dark grey (#222623), not much contrast on desktop. Mobile uses other colours, the same background (#222623) but a lighter font color (#B2B5B3), which is _way_ better.

        Why not use the same foreground color on desktop?

keyle 10 days ago

The author seems to be experimenting in UE4 or UE5 (a material graph is shown in a screenshot), but the examples are displayed in Shadertoy embeds?

I'm wondering, is there a direct way to export a UE4 material shader to Shadertoy, or some easy conversion tool? Otherwise it would have taken eons to produce this page...

  • barfbagginus 10 days ago

    UE translates shader graphs to HLSL - high level shading language, see:

    https://dev.epicgames.com/community/learning/knowledge-base/...

    Shadertoy needs GLSL - the OpenGL Shading Language. Luckily, UE has an HLSL -> GLSL transpiler built in:

    https://docs.unrealengine.com/4.27/en-US/ProgrammingAndScrip...

    There are other HLSL transpilers: Microsoft's ShaderConductor, Unity's hlsl2glsl, Vulkan's vcc, etc.

    To port your favorite Shadertoy examples back to UE, you can transpile GLSL to HLSL with ShaderTranspiler, glslcc, ShaderConductor, etc.

    Disclaimer: I don't use UE or Shadertoy. In fact, this is my first exposure to GLSL/HLSL. My claims may be inaccurate.

  • e_dziewanowski 10 days ago

    The website acts as my portfolio - I'm a game developer, so that is why I use the Unreal material graph. Shadertoy allows me to demonstrate ideas with a live example that is animated, and anybody can play with its code. For the most part HLSL (Unreal) can be translated to GLSL (Shadertoy), but that wasn't the case here. In Unreal I use my own custom flow textures; in Shadertoy that is not possible - everything has to be stored in code. Even though the basic idea behind the Unreal and Shadertoy shaders was the same, the implementations were quite different. It was easier to just do everything twice than to convert it. And yes - it took a lot of work :).

  • mandarax8 10 days ago

    Looking at the final Shadertoy example (https://www.shadertoy.com/view/4XSXz3), I would think he just recreated each effect in Shadertoy (the variable and function names don't look exported to me).

    Most of the effects on the page are only a couple of lines, it seems, so maybe he did just rewrite them all? I do wonder why he bothered with UE material graphs if he's this proficient at shaders anyway.

    • davedx 10 days ago

      I can imagine using material graphs is a much better way to experiment, iterate and progressively build up the effects than hand coding a shader. It's kind of like asking why write code in C# in Visual Studio when you can just write assembly?

mikercampbell 10 days ago

I can almost feel the drops in my hair. But for real this is so cool

OCISLY 10 days ago

0.3 fps...

adzm 10 days ago

The other articles on this site are just as fascinating. What a treasure!!

Log_out_ 9 days ago

Storm lightning and aurora?

mitch7w 10 days ago

This is really cool!