
[SOLVED] lgmpClientSessionInit Failed

Hello, all.

I’ve got Looking Glass B4 server and client on an Arch host and a freshly installed W10 guest. I’m passing my 1080 Ti through while the host uses my RX480. The lot of it seems more or less fine, except for Looking Glass. When I run the client well after Windows boots, I get the following:

[[email protected] ~]$ looking-glass-client 
[I]  17365369829              main.c:1064 | main                           | Looking Glass (B4+)
[I]  17365369857              main.c:1065 | main                           | Locking Method: Atomic
[I]  17365371934           ivshmem.c:127  | ivshmemOpenDev                 | KVMFR Device     : /dev/shm/looking-glass
[I]  17365466435               egl.c:274  | egl_initialize                 | Double buffering is off
[I]  17365466453              main.c:671  | tryRenderer                    | Using Renderer: EGL
[W]  17365466712              idle.c:31   | waylandIdleInit                | zwp_idle_inhibit_manager_v1 not exported by compositor, will not be able to suppress idle states
[I]  17365467125                gl.c:57   | waylandGetEGLDisplay           | Using eglGetPlatformDisplay
[I]  17365526282               egl.c:685  | egl_render_startup             | Single buffer mode
[I]  17365532711               egl.c:701  | egl_render_startup             | EGL       : 1.5
[I]  17365532723               egl.c:702  | egl_render_startup             | Vendor    : AMD
[I]  17365532732               egl.c:703  | egl_render_startup             | Renderer  : AMD Radeon (TM) RX 480 Graphics (POLARIS10, DRM 3.41.0, 5.13.13-arch1-1, LLVM 12.0.1)
[I]  17365532737               egl.c:704  | egl_render_startup             | Version   : OpenGL ES 3.2 Mesa 21.2.1
[I]  17365532741               egl.c:705  | egl_render_startup             | EGL APIs  : OpenGL OpenGL_ES 
[I]  17365532746               egl.c:706  | egl_render_startup             | Extensions: EGL_ANDROID_blob_cache EGL_ANDROID_native_fence_sync EGL_EXT_buffer_age EGL_EXT_create_context_robustness EGL_EXT_image_dma_buf_import EGL_EXT_image_dma_buf_import_modifiers EGL_EXT_swap_buffers_with_damage EGL_KHR_cl_event2 EGL_KHR_config_attribs EGL_KHR_create_context EGL_KHR_create_context_no_error EGL_KHR_fence_sync EGL_KHR_get_all_proc_addresses EGL_KHR_gl_colorspace EGL_KHR_gl_renderbuffer_image EGL_KHR_gl_texture_2D_image EGL_KHR_gl_texture_3D_image EGL_KHR_gl_texture_cubemap_image EGL_KHR_image_base EGL_KHR_no_config_context EGL_KHR_reusable_sync EGL_KHR_surfaceless_context EGL_KHR_swap_buffers_with_damage EGL_EXT_pixel_format_float EGL_KHR_wait_sync EGL_MESA_configless_context EGL_MESA_drm_image EGL_MESA_image_dma_buf_export EGL_MESA_query_driver EGL_WL_bind_wayland_display EGL_WL_create_wayland_buffer_from_image 
[I]  17365545715                gl.c:83   | waylandEGLSwapBuffers          | Using EGL_KHR_swap_buffers_with_damage
[E]  17365555831              main.c:891  | lg_run                         | lgmpClientSessionInit Failed: LGMP_ERR_INVALID_VERSION

I know libgmp is GNU’s arbitrary-precision math library, but perhaps that’s not even the library involved here? (After all, lg_run looks like “Looking Glass run.”) Other than that, I’m clueless as to what’s amiss.

Any pointers would be greatly appreciated. =)

Further investigation indicates that LGMP is the Looking Glass Memory Protocol, and that this error probably comes from LGMP/lgmp/src/client.c, specifically:

LGMP_STATUS lgmpClientSessionInit(PLGMPClient client, uint32_t * udataSize,
    uint8_t ** udata)
{
  assert(client);
  struct LGMPHeader * header = client->header;

  if (header->magic != LGMP_PROTOCOL_MAGIC)
    return LGMP_ERR_INVALID_MAGIC;

  if (header->version != LGMP_PROTOCOL_VERSION)
    return LGMP_ERR_INVALID_VERSION;
...
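Out of curiosity, the start of the segment can be peeked at directly. Assuming the header really does begin with the two fields checked above (a 4-byte magic, presumably ASCII “LGMP”, followed by a little-endian 32-bit version — that layout is my guess from the code, not something I’ve verified), something like this shows what to look for. The version value 6 below is made up; point hexdump at /dev/shm/looking-glass to see the real thing:

```shell
# Fake a segment header: "LGMP" magic plus a little-endian 32-bit
# version (6 here is a made-up value, not B4's real protocol version).
SEG=$(mktemp)
printf 'LGMP\006\000\000\000' > "$SEG"

# First 8 bytes: 4c 47 4d 50 ("LGMP"), then the version bytes.
# Swap "$SEG" for /dev/shm/looking-glass to inspect the live segment.
hexdump -C -n 8 "$SEG"
rm -f "$SEG"
```

If the version bytes in the real segment don’t match what the client was built against, that would explain the LGMP_ERR_INVALID_VERSION above.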

That’s downright interesting… Perhaps my shared memory segment persisted from when I mistakenly compiled the bleeding edge client and tried running it against B4’s host on Windows?

Guess I’ll look around at how to “redo” the shared memory segment. :wink:
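For anyone who lands here with a genuinely stale segment, clearing it should just be a matter of deleting the file while both sides are stopped (path taken from the log above; this is a sketch, not an official procedure):

```shell
# With the Looking Glass client stopped on the host and the host app
# stopped in the guest, remove the stale KVMFR segment; it gets
# recreated on next start.  -f so this also succeeds if the file is
# already gone.
rm -f /dev/shm/looking-glass
```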

Nnnnope!

I’m just a touch foolish and didn’t heed the warning not to build from a direct GitHub clone. I grabbed the source tarball from the website instead, and everything built and ran just fine.

Nothing to see here, folks. =)
