YoGames Studios

Members
  • Posts

    46
  • Joined

  • Last visited

  • Days Won

    6

Everything posted by YoGames Studios

  1. My system runs on an Intel HD Graphics 520. Also, the setup needed for this is a little hilarious.
  2. Well, about 3 months ago I posted a tutorial about sphere reflections in which I used external software and claimed that it would not be possible to get the same effect just within MI. But well, it seems that I found a way to do that, even though some fine tuning still needs to be done and I still have to get creative to solve the problem with the bottom and top.
  3. I am 99% sure that (with the current MI version) it is not possible to make it adaptive. Actually, I have known this method for over a year and was experimenting with it, but gave up on it. Programming-wise it should not be too difficult to implement a feature that makes accurate reflections within MI (since it is just some math that has to be done). But, in order to make it usable with videos, I will make a program that automatically creates the keyframes for changing the texture every single frame, which will drastically improve the workflow with animated textures and baked reflections (see the keyframe-generator sketch after this list).
  4. So, here it is. Another tutorial about reflections in Mine-imator. Hope you enjoy. Blend-file: http://www.mediafire.com/file/j62rrffbvbii7mb/EquiConverter.blend1/file 360 Toolkit: https://360toolkit.co/convert-cubemap-to-spherical-equirectangular
  5. There is no specific meaning to the scene, I just needed some "specular highlights" and general lighting in the scene to test it.
  6. It sometimes does not show it, and I have to edit it later. It should be there now.
  7. Well, it took me some time to get motivated again, but here it is, another Showcase Short dealing with reflections. But this time it is not as accurate as I would like it to be.
  8. I'm by far not good enough to be a Mine-imator developer; the tools I make are just really basic Java applications (and Java isn't used in MI, and is the only language I partially know). Also, I don't fulfill some of the requirements for the developers (e.g. I'm not 18 yet). Maybe I'll learn the other required languages by the time I turn 18; in that case I would be glad to become one of the MI developers, but since I'm currently 15 this will happen in 2.5 years at the earliest.
  9. Actually, that is what I was working on, but since Mine-imator uses its fancy local rotation, it looks very silly when you import it to MI. Later I realised that it can still be used for just location translation and found this as an example. Maybe there is a way to convert the global-space rotation to MI's rotation with some math, but currently I don't know how this could work.
  10. I use an armature in Blender and export the animation to a motion capture file (.bvh). The actual tool is a really simple 300-line Java program that only converts the .bvh to a .mimodel and .miframes file, so that you can then import this into Mine-imator (see the .bvh sketch after this list).
  11. This is a test for a physics translation tool that is supposed to seamlessly translate a baked rigid-body animation from Blender to Mine-imator, but it currently only works with untextured spheres (because you can't tell that they don't rotate).
  12. I see, this of course will make the reflection cleaner. Also: instead of adding the last layer you can also try to screen it, which could work better sometimes since it prevents overexposure (see the blend-mode sketch after this list). Did you try to restart MI? I also had some situations in which the image was not in sync anymore, but after restarting, I was able to composite up to 4 compositions (from other cameras) together.
  13. I tried to replicate the effect so that it works in an animation, and it worked quite well. I don't know how you did it, but I see no reason for another layer. Also, I think in your case you don't have to make a copy of your scene. You should be able to just mirror the camera movement, since it doesn't seem like anything gets in the way of the camera, or am I missing something? How I did it: I'm not that much of an artist though. Only 1 scenery.
  14. The fog would have been my second guess. But this mirror update issue should disappear after restarting MI.
  15. Does it use a gradient texture in the mask to get the Fresnel effect?
  16. Yes, it should be possible, you just have to make sure that the camera motion is mirrored along the reflective surface (see the mirroring sketch after this list). As long as all cameras are doing the same and the mirrorCam is mirrored properly, you can also move and rotate the reflective surface during animations. That's why I always say to put the things in a folder. Here is the image.
  17. Yes, but only binary. You could make a "roughness map" that makes parts of the reflection sharp again, but you can only choose between 100% sharp or 100% soft. If you wait a few minutes I can upload a render.
  18. This is the 3rd tutorial of my MI realtime masking tutorial series; the topic is Perspectively Correct Reflections. WARNING: since my English is not the best, some of the explanations are tough to understand, so if you have any questions just ask, but watch the full video before you do so. I use MI version 1.2.4 in this tutorial.
  19. This is the 2nd tutorial of my MI realtime masking tutorial series; the topic is Perspectively Correct Portals. WARNING: since my English is not the best, some of the explanations are tough to understand, so if you have any questions just ask, but watch the full video before you do so. I use MI version 1.2.4 in this tutorial.
  20. This is the 1st tutorial of the MI realtime masking tutorial series I promised to make; the topic is Adaptive Volumetric Light. WARNING: since my English is not the best, some of the explanations are tough to understand, so if you have any questions just ask, but watch the full video before you do so. I use MI version 1.2.4 in this tutorial.
  21. I will make a tutorial series about realtime masking in Mine-imator. It will cover perspectively correct volumetric light, portals, non-Euclidean worlds, reflections/mirrors and other stuff you can do with realtime masking. This "perspectively correct" NEW (non-Euclidean world) also uses this realtime masking technique, which relies on a quite complicated multi-camera setup in combination with the new blend modes and is too complicated for me to explain in text. Also, these are technical showcases and not 100% developed yet; that's why I post this kind of stuff to the random tests and don't explain them yet, because some things could still change.
  22. Aha, OK, that's what you meant with keyframe play, now I got it. But for bigger or more complex scenes this could become a lot of work.
  23. I know that it's possible to get the effect with the alpha glitch, but it won't be perspectively correct when you place something in front of it and view it from the other side.
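
Below is a minimal Java sketch of the kind of per-frame keyframe generator mentioned in post 3. The output layout (the "position"/"texture" entries) is purely hypothetical, since the real .miframes format is not described here; the point is only the idea of emitting one texture-swap keyframe per frame of a baked reflection sequence.

import java.io.IOException;
import java.io.PrintWriter;

// Sketch: writes one texture-swap keyframe per frame so an animated
// texture (e.g. a baked reflection sequence) advances every frame.
// The "keyframes" layout below is hypothetical, not the real .miframes spec.
public class TextureKeyframeGenerator {
    public static void main(String[] args) throws IOException {
        int frameCount = 120;                          // length of the animation
        String texturePattern = "reflection_%04d.png"; // baked frames on disk (illustrative name)
        try (PrintWriter out = new PrintWriter("texture_keyframes.json")) {
            out.println("{ \"keyframes\": [");
            for (int frame = 0; frame < frameCount; frame++) {
                String texture = String.format(texturePattern, frame);
                out.printf("  { \"position\": %d, \"texture\": \"%s\" }%s%n",
                        frame, texture, frame < frameCount - 1 ? "," : "");
            }
            out.println("] }");
        }
    }
}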
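
For post 10, here is a minimal Java sketch that reads just the MOTION block of a standard .bvh file (frame count, frame time, and the per-frame channel values). The actual conversion to .mimodel/.miframes is the author's own 300-line tool and is not reproduced here.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

// Sketch: parses only the MOTION section of a .bvh motion capture file.
public class BvhMotionReader {
    public static void main(String[] args) throws IOException {
        try (BufferedReader in = new BufferedReader(new FileReader("animation.bvh"))) {
            // Skip the HIERARCHY section until the MOTION keyword.
            String line;
            while ((line = in.readLine()) != null && !line.trim().equals("MOTION")) { }

            // "Frames: N" and "Frame Time: t" follow the MOTION keyword.
            int frames = Integer.parseInt(in.readLine().trim().split("\\s+")[1]);
            double frameTime = Double.parseDouble(in.readLine().trim().split("\\s+")[2]);

            // Each remaining line holds one frame's channel values (positions/rotations).
            List<double[]> motion = new ArrayList<>();
            for (int f = 0; f < frames; f++) {
                String[] tokens = in.readLine().trim().split("\\s+");
                double[] channels = new double[tokens.length];
                for (int c = 0; c < tokens.length; c++) {
                    channels[c] = Double.parseDouble(tokens[c]);
                }
                motion.add(channels);
            }
            System.out.printf("%d frames at %.4f s/frame, %d channels each%n",
                    frames, frameTime, motion.get(0).length);
        }
    }
}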
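
For post 12, a small Java sketch of why "screen" can prevent the overexposure that plain "add" causes, assuming colour channels normalised to the 0..1 range (this is the general blend-mode math, not Mine-imator's internal implementation).

// Sketch: add clips bright areas at 1.0, screen approaches 1.0 smoothly.
public class BlendModes {
    // Add can exceed 1.0 and must be clipped, causing blown-out highlights.
    static double add(double base, double layer) {
        return Math.min(1.0, base + layer);
    }

    // Screen: 1 - (1 - base) * (1 - layer); never exceeds 1.0, so no clipping.
    static double screen(double base, double layer) {
        return 1.0 - (1.0 - base) * (1.0 - layer);
    }

    public static void main(String[] args) {
        double base = 0.8, reflection = 0.6;
        System.out.printf("add:    %.2f%n", add(base, reflection));    // 1.00 (clipped)
        System.out.printf("screen: %.2f%n", screen(base, reflection)); // 0.92
    }
}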
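
For post 16, a short Java sketch of the plane-reflection math behind "mirror the camera motion along the reflective surface": the mirror camera's position (and view direction) is the main camera reflected across the mirror plane each frame. The vector helpers and the example plane are illustrative only, not part of any Mine-imator API.

// Sketch: reflect the camera across the mirror plane to get the mirrorCam.
public class MirrorCamera {
    // Reflect a point p across the plane through planePoint with unit normal n.
    static double[] reflectPoint(double[] p, double[] planePoint, double[] n) {
        double d = dot(sub(p, planePoint), n);
        return sub(p, scale(n, 2.0 * d));
    }

    // Reflect a direction vector (no plane offset needed for directions).
    static double[] reflectDirection(double[] v, double[] n) {
        return sub(v, scale(n, 2.0 * dot(v, n)));
    }

    static double dot(double[] a, double[] b) { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }
    static double[] sub(double[] a, double[] b) { return new double[]{a[0]-b[0], a[1]-b[1], a[2]-b[2]}; }
    static double[] scale(double[] a, double s) { return new double[]{a[0]*s, a[1]*s, a[2]*s}; }

    public static void main(String[] args) {
        double[] cam = {4, 3, 2};
        double[] mirrorPoint = {0, 0, 0};   // any point on the mirror plane
        double[] mirrorNormal = {0, 0, 1};  // mirror lies in the XY plane
        double[] mirrored = reflectPoint(cam, mirrorPoint, mirrorNormal);
        System.out.printf("mirrorCam position: (%.1f, %.1f, %.1f)%n",
                mirrored[0], mirrored[1], mirrored[2]); // (4.0, 3.0, -2.0)
    }
}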