Questions about PVRShaman UNIFORMs and PVRShell input events

This topic contains 4 replies, has 2 voices, and was last updated by  js1110 3 years, 9 months ago.

Viewing 5 posts - 1 through 5 (of 5 total)
  • #31594

    js1110
    Member

    Hello, everyone.
    I just started with the PVR Insider SDK, so I have a lot to learn from you all.
    I succeeded in importing a simple 3D model (a cube, a light, and a camera) exported from Blender into an iOS project. The project is based on PVRShell, and since the model has no textures attached, I used the vertex and fragment shaders from PVRShaman’s Basic Effect file. I had a hard time figuring out the matrices behind the PVRShaman UNIFORMs such as WORLD, WORLDVIEWPROJECTION, LIGHTDIRECTION, and so on.
    So what I am asking is:
    1. Is there any reference explaining how to calculate those uniforms from the values in a POD file? I can read the POD file using PVRTools, but I don’t know the formulas for those uniforms.
    2. I am stuck on touch events. I mean, I have no leads on how to handle touch events within the PVRShell structure. So can anyone please tell me how?
    3. I see that every POD model used in the PVR SDK examples uses .pvr texture files, not PNG format. But with Blender I can only generate PNG texture files and export materials referencing PNGs. So how can I export material textures in the PVR format?
    Any help would be appreciated.
    Regards.

    #38333

    dgu
    Member

    Hello,

    I am a user of the SDK, and yes, it’s a great set of tools made for games.

    About the PFX: the semantics are described in the documentation and in all the samples shipped with the PVRShaman tool; the file OGLES2/PVRTPFXSemantics.h defines them as well.

    To my knowledge there is no automatic way to get the uniforms from the POD, but since it is designed with materials in mind, it’s entirely possible to set them dynamically based on a material name or any convention/strategy you would like.
    The Skybox2 sample gives a good example of this (MagicLantern as well).
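    To make the idea of driving uniforms from their semantic names concrete, here is a minimal, self-contained sketch of the kind of lookup that OGLES2/PVRTPFXSemantics.h implements. The table below is my own cut-down stand-in, not the SDK’s actual list:

```cpp
#include <cstring>

// Toy version of the semantic-name lookup: map a semantic string from
// the PFX file to an enum you can switch on when binding uniforms.
// (This is a sketch; the real SDK table in PVRTPFXSemantics.h is longer.)
enum ESemantic { eWORLD, eWORLDVIEWPROJECTION, eLIGHTDIRWORLD, eUNKNOWN };

struct SSemanticEntry { const char *pszName; ESemantic eType; };

static const SSemanticEntry c_Semantics[] = {
    { "WORLD",               eWORLD },
    { "WORLDVIEWPROJECTION", eWORLDVIEWPROJECTION },
    { "LIGHTDIRWORLD",       eLIGHTDIRWORLD },
};

ESemantic LookupSemantic(const char *pszName)
{
    for (unsigned int i = 0; i < sizeof(c_Semantics) / sizeof(c_Semantics[0]); ++i)
        if (strcmp(pszName, c_Semantics[i].pszName) == 0)
            return c_Semantics[i].eType;
    return eUNKNOWN;
}
```

    With something like this you can walk the uniforms a PFX effect declares, look each one up, and set its value from the POD material or node each frame.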

    For example, accessing the materials from the POD is:

    m_pMaterials = new Material[m_Scene.pod.nNumMaterial];

    for (unsigned int i = 0; i < m_Scene.pod.nNumMaterial; i++) {
        const SPODMaterial &material = m_Scene.pod.pMaterial[i];

        m_pMaterials[i].fSpecularPower = material.fMatShininess;
        m_pMaterials[i].vDiffuseColour = PVRTVec3(material.pfMatDiffuse);
        // ... do your thing
    }

    2/

    int buttonState = PVRShellGet(prefButtonState);
    PVRTVec2 *pMousePtr = (PVRTVec2*)PVRShellGet(prefPointerLocation);
    if (0 != buttonState && NULL != pMousePtr) {
        PVRTVec2 mousePos = *pMousePtr;
        // ...
    }

    You can extend the shell to catch more specific OS events or make OS-specific calls; I believe they designed it for that.

    3/ PVRTexToolGUI does that. You can even do it from the command line, or build your own tool that parses the POD, finds the PNGs, and generates the appropriate command lines (checking, for instance, that an image is power-of-two).
    That is of course well described in the PVRTexTool User Manual PDF and the performance recommendations document.

    It’s a beautiful SDK!!!!

    Kind regards
    david Guimard

    #38334

    js1110
    Member

    Hello, David.
    Thank you for your comment; that really helps. But I still have questions.
    1. I know the semantics of the uniforms, but I don’t know how to calculate them. Here’s the formula for WORLDVIEWPROJECTION I used:
    PVRTMat4 mView, mProjection,mWorld,mWVP;
    ...
    mView = PVRTMat4::LookAtRH(vFrom, vTo, vUp);
    ...
    mProjection = PVRTMat4::PerspectiveFovRH(fFOV, (float)PVRShellGet(prefWidth)/(float)PVRShellGet(prefHeight), g_fCameraNear, g_fCameraFar, PVRTMat4::OGL, bRotate);
    ...
    mWorld = m_Scene.GetWorldMatrix(Node);
    mWVP = mProjection * mView * mWorld;
    I can also get the material ambient, diffuse, and specular colours and the material shininess. But the other uniforms are a mystery to me. So I was wondering if there is any reference that would solve my problem. Or how do you calculate those uniforms?
    2. I can see from your code that I can get the touch/mouse pointer state and location, but I don’t know where to use this code. In the render loop? If you have done this event handling (touch events specifically) before, maybe you could share your whole shell code here or anywhere else so that I can refer to it. Only if you like.
    3. I have converted PNG files to PVR files using PVRTexToolGUI. But the problem is that Blender can’t use .pvr texture files for materials. So is there any way around this? I want to make a POD file whose materials reference .pvr textures, which is obviously impossible for me with Blender alone.

    I totally agree with you that it is a beautiful SDK!!!

    Regards.

    #38335

    dgu
    Member

    Hello

    1/
    I would refer you to the existing samples then, because it is all described there, up and running.
    The training course on the PFX introduction shows how to use ECustomSemantic, which is a very handy feature; if you know GLSL you can then pass anything you want.
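    As a sanity check on question 1: the formula in the previous post is right. With column vectors, a model-space point reaches clip space as Projection * View * World * p, which is exactly why mWVP = mProjection * mView * mWorld. Below is a toy, self-contained illustration of that composition order (my own row-major Mat4 standing in for PVRTMat4, whose real storage layout differs):

```cpp
// Toy 4x4 matrix maths showing the WORLDVIEWPROJECTION composition
// order: p_clip = Projection * View * World * p_model.
struct Vec4 { float x, y, z, w; };
struct Mat4 { float m[16]; };                 // row-major: m[row*4 + col]

Mat4 Identity()
{
    Mat4 r = {};
    r.m[0] = r.m[5] = r.m[10] = r.m[15] = 1.0f;
    return r;
}

Mat4 Translate(float tx, float ty, float tz)
{
    Mat4 r = Identity();
    r.m[3] = tx; r.m[7] = ty; r.m[11] = tz;   // translation in last column
    return r;
}

Mat4 Mul(const Mat4 &a, const Mat4 &b)        // returns a * b
{
    Mat4 r = {};
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col)
            for (int k = 0; k < 4; ++k)
                r.m[row*4 + col] += a.m[row*4 + k] * b.m[k*4 + col];
    return r;
}

Vec4 Transform(const Mat4 &m, const Vec4 &v)  // column-vector convention
{
    Vec4 r;
    r.x = m.m[0]*v.x  + m.m[1]*v.y  + m.m[2]*v.z  + m.m[3]*v.w;
    r.y = m.m[4]*v.x  + m.m[5]*v.y  + m.m[6]*v.z  + m.m[7]*v.w;
    r.z = m.m[8]*v.x  + m.m[9]*v.y  + m.m[10]*v.z + m.m[11]*v.w;
    r.w = m.m[12]*v.x + m.m[13]*v.y + m.m[14]*v.z + m.m[15]*v.w;
    return r;
}
```

    With World = a translation and View = Projection = identity, the model-space origin lands exactly at the translation, confirming the ordering. The same reasoning gives you the other matrix semantics: WORLDVIEW = View * World, and the inverse-transpose variants (e.g. WORLDVIEWIT) are what you feed normals through.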

    2/ You will use it in RenderScene. For more information have a look at:

    void HandleInput();
    void HandleMouseClick(PVRTVec2 pos);
    void HandleMouseDrag(PVRTVec2 dir);

    in the Advanced/Navigation3D samples.
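    As a standalone sketch of the per-frame bookkeeping RenderScene would do: in the real shell the two inputs come from PVRShellGet(prefButtonState) and PVRShellGet(prefPointerLocation) as shown earlier; here they are plain parameters (and the class name is mine, not from the SDK) so the click/drag edge detection can be shown on its own:

```cpp
// Per-frame input bookkeeping: detect a click on the press edge and
// report a drag delta while the button/finger stays down.
struct Vec2 { float x, y; };

class InputTracker
{
public:
    InputTracker() : m_bWasDown(false), m_LastPos() {}

    // Call once per frame. Returns true on the frame the button goes
    // down (a click); fills dragDelta while the button is held.
    bool Update(bool bDown, Vec2 pos, Vec2 &dragDelta)
    {
        bool bClick = bDown && !m_bWasDown;   // press edge this frame
        dragDelta.x = dragDelta.y = 0.0f;
        if (bDown && m_bWasDown) {            // held: report movement
            dragDelta.x = pos.x - m_LastPos.x;
            dragDelta.y = pos.y - m_LastPos.y;
        }
        m_bWasDown = bDown;
        m_LastPos = pos;
        return bClick;
    }

private:
    bool m_bWasDown;
    Vec2 m_LastPos;
};
```

    In RenderScene you would construct one of these as a member, feed it the shell’s button state and pointer location every frame, and route the resulting click/drag events to your camera or picking code, which is essentially what the Navigation3D HandleMouseClick/HandleMouseDrag pair does.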

    3/ I don’t know Blender, I am using the great 3ds Max…, but anyway I believe it is impossible to read a POD file back into Blender (that would be a cool feature). So you can leave your PNGs as they are when you export your POD, and swap them for .pvr files during the texture-load step. That swap is done, once again, in many samples shipped with the SDK, and especially in the Navigation3D demo, a very good one. That way you are not bothered setting it up each time, especially when you are using a lot of models and different textures.
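    The extension swap described above can be as small as this (the function name is mine, not from the SDK; at load time you would pass the rewritten name to your texture-loading call):

```cpp
#include <string>

// Rewrite the texture filename stored in the POD ("stone.png") to the
// converted file you actually ship ("stone.pvr"). Sketch only: it
// replaces whatever comes after the last '.' in the string.
std::string SwapToPVR(const std::string &filename)
{
    std::string::size_type dot = filename.rfind('.');
    if (dot == std::string::npos)
        return filename + ".pvr";            // no extension: just append
    return filename.substr(0, dot) + ".pvr";
}
```

    You would call this on each material’s texture name right before loading, so the POD can keep referencing the PNGs Blender exported.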

    The application scenecompiler.cpp can also help (some customisation may be needed).

    Yeah, nothing is impossible with it.

    kind regards
    david

    #38336

    js1110
    Member

    Thank you for your help.
