Q3D V-2.4 [3D Physics + Skeletal Animation UPDATE]

  • Sadly, the exporter doesn't support exporting multiple animations yet. (And I have a feeling it never will; for some reason nobody is willing to work on the code anymore. I would if I were a coder.)

    You have to export a separate JSON file for each of the animations your character has, and then combine them manually in a text editor like Notepad++. It's pretty self-explanatory if you just take a look at the model files from some of the Q3D examples, like the Quake-style mercenary guy. (A rough merge sketch follows below.)

    That's what I thought. I tried doing just that and thought I was doing everything right, until I hit a JavaScript error when trying to run it. I guess I'll keep trying. Thanks for the help!
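
    A minimal sketch of that manual merge, assuming the three.js JSON model format where each export carries its clips under an "animations" array (or a singular "animation" object in older exporters); the file names are made up, so adapt them to your own exports:

    // merge-animations.js: hypothetical helper, run with `node merge-animations.js`.
    // Assumes each exported file is the same mesh/skeleton and only the animation
    // data differs between exports.
    const fs = require('fs');

    const files = ['hero_idle.json', 'hero_walk.json', 'hero_attack.json']; // assumed names
    const models = files.map(f => JSON.parse(fs.readFileSync(f, 'utf8')));

    // Collect every clip from every export into one list.
    const allClips = [];
    for (const m of models) {
      const clips = m.animations || (m.animation ? [m.animation] : []);
      allClips.push(...clips);
    }

    // Use the first export as the base model and attach the combined clip list.
    const merged = models[0];
    delete merged.animation;      // drop the old single-clip field if present
    merged.animations = allClips;

    fs.writeFileSync('hero_all.json', JSON.stringify(merged));
    console.log('wrote hero_all.json with', allClips.length, 'animations');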

  • Yeah, the three.js exporters are pretty bad with regard to animations, but they're all that's available at the moment. If I knew how to write one I would, but writing exporters isn't something I'm familiar with, sadly.

    The community developing three.js has been making some big changes to how animation works in the API, so hopefully a better exporter will come along eventually. Three.js has moved on since I last updated Q3D (the plugin kind of stagnated on r71, which it still uses), so hopefully when I get some free time I can update the plugin a little and fix some of the bugs that r71 / Chrome have.

    You should be able to just copy and paste things the way the Quake guy's files do, but obviously you need to be careful with the syntax.

  • Making a kind of Elite-inspired wireframe space exploration game. Having trouble with rotating the ship and then travelling in that new direction; I guess the tricky part is getting that curving trajectory while moving.

    Feel free to help

  • railslave

    There are expressions to get the normalized world-space components of the "local" directions (i.e. the components of the rotated XYZ frame defined by the axis helper) for exactly this kind of thing. You can also just use the "translate (local space)" action. (There's a quick sketch of the underlying math below.)
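
    For anyone curious, here is a minimal sketch of the math those expressions wrap, in plain JavaScript rather than Q3D events. The yaw/pitch convention and all names are assumptions for illustration, not actual Q3D expressions:

    // Move a ship along its own forward axis, given yaw (heading) and pitch in radians.
    // Assumed convention: yaw about Y, pitch about X, forward = -Z at rest.
    function localForward(yaw, pitch) {
      return {
        x: -Math.sin(yaw) * Math.cos(pitch),
        y:  Math.sin(pitch),
        z: -Math.cos(yaw) * Math.cos(pitch)
      };
    }

    function stepShip(ship, speed, dt) {
      const f = localForward(ship.yaw, ship.pitch);
      ship.x += f.x * speed * dt;
      ship.y += f.y * speed * dt;
      ship.z += f.z * speed * dt;
    }

    // Rotating the ship a little each tick while calling stepShip gives the curving
    // trajectory, because the forward axis turns together with the ship.
    const ship = { x: 0, y: 0, z: 0, yaw: 0, pitch: 0 };
    stepShip(ship, 10, 1 / 60);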

  • I'm going to be working on the plugin soon so if anyone has bugs to report that'd be great!

  • I'm going to be working on the plugin soon

    Oh man, that's great news.

    The way animations are handled by events could use some restructuring and simplification, no? Lots of people are having trouble getting animations to work.

    Maybe we could get shadows to recognize alpha values of a texture, so that, for instance, patches of leaves on a tree don't just cast quad-shaped shadows.

    Oh, and perhaps there's someone somewhere willing to make the necessary improvements to the Blender exporter script. Multi-action export would increase the value of Q3D tenfold. Maybe you, being the plug-in's creator, have a better chance of getting someone to do this than any of us.

  • Animations are pretty straightforward with events; there just aren't any good examples uploaded. I can't really improve them! I'll have to look into the alpha shadow stuff; it's a three.js issue with how alpha test is implemented, if I recall.

  • railslave

    There are expressions to get the normalized world-space components of the "local" directions (i.e. the components of the rotated XYZ frame defined by the axis helper) for exactly this kind of thing. You can also just use the "translate (local space)" action.

    Thanks.

    One more thing: any idea how to increase the draw distance, massively? I tried messing about with the fogging, but to no avail.

  • railslave

    Adjust the camera's far clipping plane. (Note that you'll get a lower-quality Z-buffer, though.) A raw three.js snippet of what that means follows below.
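
    In raw three.js terms (which Q3D wraps), that amounts to something like the sketch below; the camera parameters are placeholders, not Q3D's actual defaults:

    // Push the far plane out so distant objects aren't clipped.
    // A wider near-far range means less depth precision (the lower-quality Z-buffer
    // mentioned above), so also keep the near plane as large as the game allows.
    const camera = new THREE.PerspectiveCamera(70, 16 / 9, 0.1, 1000); // placeholder values
    camera.near = 1;
    camera.far = 100000; // placeholder "massive" draw distance
    camera.updateProjectionMatrix(); // required after changing near/far/fov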

  • railslave

    Adjust the camera's far clipping plane. (Note that you'll get a lower-quality Z-buffer, though.)

    Ah, thank you, that's brilliant. Found it in the frustum settings.

    Game-changing plugin.

    Had a hard time keeping up with the Unity subscription; not any more.

    Hindu lore meets a 1980s No Man's Sky, here we come.

  • Although parenting to the camera seems like it should be a good solution for a UI, it allows objects to "penetrate" the UI, which will look weird. Ideally you use two Q3D Viewports: create an extra scene with a fixed camera and your UI objects on a zero-alpha background, and layer that above the viewport for the main game scene. If you really need to parent objects to the camera, just make a dummy object which sets its angle/position to the camera's x/y/z position and rotation, but again, for a UI this isn't the best option.

    If you use a Q3D Viewport, you can make the viewport small so it only renders a smaller area if the UI is just something like a bar at the top; it doesn't have to cover the entire screen. For 2D UIs it's best to just use the Construct 2 canvas as a "layer" itself, with behind mode set on Q3D Master.

    Using extra scenes is a bit weird, but the main things you have to know are:

    1. By default, everything is created in the scene named "Default", and this scene is picked.

    2. To use another scene, you need to first create a scene with a unique name like "MyScene", similar to how multiple cameras work.

    3. To add objects / change properties / do anything really with a specific scene, you need to pick it with the "Pick scene" action in Q3D Master, similar to how multiple cameras work.

    4. Once the scene is picked, any actions that have anything to do with a scene will affect the "picked" scene, e.g. change background, parent an object to a scene, create an object, change fog, etc. The only exception is cameras, which exist independently of any scene (they're global to all scenes, so they don't belong to any one scene).

    5. To move objects between scenes you need to use the action in the hierarchy section that parents them to a scene, so pick the scene you want to move them to first.

    6. To affect the default scene again, just re-pick it with the "Pick scene" action; the name it uses is "Default".

    7. Rendering multiple scenes is as easy as writing the scene name / camera name in the viewport properties and layering the viewports as you please. Semi-transparency won't render properly between viewports, though, due to the renderer design. (The three.js sketch after this post shows the underlying idea.)

    I cannot for the life of me get this to work. Has anyone here stacked two viewports on top of each other like in the above quote? If so, I sure would love it if you'd share a .capx to quell my frustration.

    I'm going to be working on the plugin soon so if anyone has bugs to report that'd be great!

    Really cool news!
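
    For reference, the underlying trick stacked viewports rely on is rendering two scenes into the same canvas without clearing in between. A minimal sketch in plain three.js rather than Q3D events; every name and value here is assumed for illustration:

    // Overlay a fixed-camera UI scene on top of the game scene in one WebGL canvas.
    const renderer = new THREE.WebGLRenderer({ antialias: true });
    renderer.setSize(800, 600);
    document.body.appendChild(renderer.domElement);
    renderer.autoClear = false; // clear manually so the UI pass doesn't wipe the game pass

    const gameScene = new THREE.Scene();
    const uiScene = new THREE.Scene(); // nothing drawn behind the UI, so the game shows through
    const gameCamera = new THREE.PerspectiveCamera(70, 800 / 600, 0.1, 1000);
    const uiCamera = new THREE.PerspectiveCamera(70, 800 / 600, 0.1, 100);

    function renderFrame() {
      renderer.clear();                       // clear color + depth once per frame
      renderer.render(gameScene, gameCamera); // main game view
      renderer.clearDepth();                  // so UI objects never end up "behind" the world
      renderer.render(uiScene, uiCamera);     // UI layer drawn on top
      requestAnimationFrame(renderFrame);
    }
    requestAnimationFrame(renderFrame);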

  • purplemonkey, I have managed to get two viewports working, but the biggest issue I had was that the second camera you make for it has to have its settings set explicitly, since new cameras don't get the same settings as the default camera.

    Quazi, it would be good if new cameras had the same default settings as the default camera. (See the camera sketch below for the kind of settings I mean.)

    I have fixed various things on my own, like flipping/mirroring sprites, adding alphatest, and making shaders work, so if those get fixed in the official plugin I won't have to reapply my fixes when there's an update.

    There's also the bug where, if the Q3D Master is a global object, models' offset parameters get added again whenever the layout changes, so their offsets end up messed up.
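
    In raw three.js terms, "setting the second camera explicitly" boils down to constructing it with the same projection parameters as the main camera. A minimal sketch; the numbers are placeholders, not Q3D's actual defaults:

    // Build the second-viewport camera with the same projection settings as the
    // main camera, so both viewports agree on fov, aspect ratio and clipping planes.
    const mainCamera = new THREE.PerspectiveCamera(70, 800 / 600, 0.1, 10000); // placeholder values
    const secondCamera = new THREE.PerspectiveCamera(
      mainCamera.fov,
      mainCamera.aspect,
      mainCamera.near,
      mainCamera.far
    );
    secondCamera.updateProjectionMatrix();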

  • purplemonkey, I have managed to get two viewports working, but the biggest issue I had was that the second camera you make for it has to have its settings set explicitly, since new cameras don't get the same settings as the default camera.

    Quazi, it would be good if new cameras had the same default settings as the default camera.

    I have fixed various things on my own, like flipping/mirroring sprites, adding alphatest, and making shaders work, so if those get fixed in the official plugin I won't have to reapply my fixes when there's an update.

    There's also the bug where, if the Q3D Master is a global object, models' offset parameters get added again whenever the layout changes, so their offsets end up messed up.

    Thanks for the input. After a lot of ifs and buts I finally managed to get it to work! Phew, I'm going to have a drink!

  • Prominent

    Noted, thanks for the concise list! It's been a while since I've worked on things, and those quirks are things I need to get fixed for sure. As I mentioned, I'm going to be moving to three.js r73, so this will probably introduce new bugs (three.js is notorious for breaking changes between versions, which makes it a pain to update), but it's worth it IMO. With the new version shadows will actually work properly with multiple light sources, point lights can support shadows, and shadows will work properly with viewports / scissor test.

    I'm curious as to how you enabled alpha test. From my testing it had issues because it required shader rebuilds, and that introduced a lot of slowdown in the object pooling/recycling system I have implemented to keep things fast; this is the main reason I had it disabled. Would you accept it if I left it fixed at something like 0.5? I haven't actually used it extensively, so I'm not sure of its main advantages; from what I understand it's there to cut down on fill rate, but it doesn't allow shadows to "see" through rejected pixels unless you use a custom depth material. The main issue is that letting people change it through an action would heavily break the optimization on objects which get it set. I'm not sure how to go about fixing this; perhaps a behaviour to flag the object type for alpha test control? I don't want it to affect the optimizations on objects that don't need it, and the SDK doesn't really offer flexible solutions for this. (There's a raw three.js sketch of the alpha test side below.)

    Shaders work; they're just not documented, because I did plan to change the system they use, and I think as it stands you can't include stuff like lights easily.
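
    For context, here's roughly what that alpha test discussion looks like on the raw three.js side. A minimal sketch; the material type, texture and values are assumptions for illustration, and the 0.5 cutoff mirrors the value suggested above:

    // Alpha test discards fragments whose texture alpha falls below a cutoff instead
    // of blending them, which cuts fill rate on foliage-style cutout textures.
    const leafTexture = new THREE.Texture(); // placeholder; load a real leaf texture here

    const leafMaterial = new THREE.MeshLambertMaterial({
      map: leafTexture,
      alphaTest: 0.5,          // the fixed cutoff proposed above
      side: THREE.DoubleSide
    });

    // Changing alphaTest after the material has been rendered once alters the
    // generated shader, so three.js has to rebuild it; that rebuild is the cost
    // that clashes with the object pooling described above.
    leafMaterial.alphaTest = 0.5;
    leafMaterial.needsUpdate = true;

    // For shadows to "see" through the rejected pixels, the mesh additionally needs
    // a depth material that applies the same alpha test (the custom depth material
    // mentioned above).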

  • Are 2D sprite HUDs doable? I've tried a few ways.
