
The small linux problem thread



You should be able to open your terminal program and type steam.
Hit enter and it should show you what it’s doing while trying to run.


tar: This does not look like a tar archive
xz: (stdin): File format not recognized
tar: Child returned status 1
tar: Error is not recoverable: exiting now
find: ‘/home/isaac/.steam/ubuntu12_32/steam-runtime’: No such file or directory


I was able to find this:

Looks like the solution is to create a missing directory with
mkdir /home/isaac/.steam/ubuntu12_32/steam-runtime
Go ahead and put that in your terminal.
The mkdir command is used to create a directory. And we’re creating the missing directory from the error message:
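As a hedged aside: if any of the parent directories happened to be missing too, plain `mkdir` would fail, but `mkdir -p` creates the whole chain and doesn't complain if the directory already exists:

```shell
# -p creates any missing parent directories as well, and is a no-op
# (no error) if the directory already exists
mkdir -p /home/isaac/.steam/ubuntu12_32/steam-runtime
```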


that’s so dumb and easy that of course i didn’t think of it… Worked beautifully lol.


Happy gaming :slight_smile:


Since I can’t seem to get an answer from Google or DDG, I figured I would ask here: does the WX 5100 work with the AMDGPU non-PRO drivers? I’m going to put my RX 560 into a PC and give it to my youngest nephew, and I was thinking of replacing it with a WX 5100 and leaving any heavy lifting to the RX 580. Just wondering if I can get away with the open drivers or if I’m going to be doing some dual-driver magic when I get to that point.


Dis help?

Hashtag product support


Haven’t done this myself but can’t see why it shouldn’t.

Calling @oO.o, as I see he owns one and should have more experience.


Yup, that helps. Still a few months from having to worry about it, though. Everything I was finding was about installing the PRO driver or was dated 2016, and things usually change in 3 years. Looks like I just need to build the kernel with modules for OpenCL and Vulkan and have libdrm_amdgpu and a few other things built in. Time to read a bunch of kernel documentation now that I know what to look for…


Been working great since about kernel 4.17. Definitely had some issues around the time AMD was making massive edits to its Linux drivers, but haven’t had any issue I can think of since then.

You might already know this, but it is a pretty measly GPU. In my case, I just wanted something that could drive productivity tasks on 4k monitors, which it’s great for, but otherwise it chokes on pretty much anything more intense than Factorio.

I’ve never used any pro drivers with it.


I cleaned it up a bit and added set -euo pipefail. It should be more readable despite my continued refusal to use if/else (or loops for the most part).


Yeah, I just want it to drive 3 displays that are not my main display and use a 580 for more intensive tasks or games.


I didn’t sell hard enough on signals :slight_smile:

They are the only form of exceptions that make sense in shell land, since they travel across processes. My shell does have actual exceptions, but they are limited to scopes within the shell process, not sub-processes.

function catch_exception() {
  echo exception caught, exiting script 1>&2
  exit 1
}

function throw_exception() {
  kill 0    # SIGTERM the whole process group, which the trap below catches
}

trap "catch_exception" SIGTERM

function list_files() {
  if ! ls -l $1 ; then
    echo error listing files in $1 1>&2
    throw_exception
  fi
}


set -o pipefail does this, unless I’m misunderstanding you. If any command in the pipe exits non-zero, the whole pipe will fail.
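A quick way to see the difference:

```shell
#!/bin/bash
false | true
echo "without pipefail: $?"   # 0 -- only the last command's status counts

set -o pipefail
false | true
echo "with pipefail: $?"      # 1 -- the failing stage propagates
```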

Also, I don’t understand how I could actually implement your solution of wrapping commands in functions since doing so precludes the ability to pipe at all (can’t pipe to a function).


Yeah, set -o pipefail does work. Pipes do work through functions, bash will fork itself and run the function, reading input and writing output just like any other program. You can even put functions in the background:

function delay () {
  sleep 1
  echo wakey wakey
}

function number_lines() {
  local cnt=1
  while read ln ; do
    echo "${cnt}. ${ln}"
    cnt=$(( $cnt + 1 ))
  done
}

delay &
cat $1 | number_lines
wait

The function above numbers lines while a function sleeps in the background. Shells have amazing powers.

1. 1. #!/bin/bash
2. 2.
3. 3. function delay () {
4. 4. sleep 1
5. 5. echo wakey wakey
6. 6. }
7. 7.
8. 8. function number_lines() {
9. 9. local cnt=1
10. 10. while read ln ; do
11. 11. echo "${cnt}. ${ln}"
12. 12. cnt=$(( $cnt + 1 ))
13. 13. done
14. 14. }
15. 15.
16. 16. delay &
17. 17. cat $1 | number_lines
18. 18. wait
19. wakey wakey
wakey wakey


There are some weird behaviors when using functions as processes. Your whole environment is exported to the sub-process, so modifications to globals are not reflected back in the main script.



myvar=main

function delay () {
  sleep 1
  myvar=delay    # only changes this background subshell's copy
  echo wakey wakey
}

function number_lines() {
  local cnt=1
  while read ln ; do
    cnt=$(( $cnt + 1 ))
  done
  myvar=number_lines    # likewise only visible in the pipeline subshell
  echo $cnt
}

delay &
echo "one = " $myvar
cat $1 | number_lines
echo "two = " $myvar
echo "three = " $myvar

myvar is set in the functions, but those assignments are only reflected in the subshells.

one =  main
wakey wakey
two =  main
three =  main

I think an interesting shell design would map the environment into shared memory, so that changes made in subshells are reflected in the parent the way it looks like they should be when reading the script. PowerShell does have features along these lines, where you can open the scope of an object and set a variable.
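Hedged aside: bash 4.2+ does have a partial answer to this in `shopt -s lastpipe`, which runs the final stage of a pipeline in the current shell rather than a subshell (it only takes effect when job control is off, i.e. in scripts), so that stage can modify the script’s globals:

```shell
#!/bin/bash
shopt -s lastpipe   # run the last pipeline stage in this shell, not a subshell

cnt=0
printf 'a\nb\nc\n' | while read ln; do
  cnt=$(( cnt + 1 ))
done
echo "cnt = $cnt"   # prints "cnt = 3"; without lastpipe it would print 0
```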


Ok yeah, you can use read in a function to capture a pipe, but you risk hanging the whole process if it never receives any stdin, so I’d avoid it.
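One mitigation, if you do go that route: `read -t` gives up after a timeout instead of blocking forever when stdin never delivers anything:

```shell
#!/bin/bash
# read -t N fails after N seconds (or at EOF) instead of blocking indefinitely
if read -t 2 line; then
  echo "got: $line"
else
  echo "no input within 2s" 1>&2
fi
```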

I also don’t think manipulating global variables directly in a function is good design. You typically want the scope of a function to be limited to stdout, stderr and return values. It should be compartmentalized.
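For instance, a function that communicates only through stdout and its exit status, with no globals touched (`count_words` here is a hypothetical helper for illustration, not from the script under discussion):

```shell
#!/bin/bash
# Result goes to stdout; the caller captures it with command substitution.
function count_words() {
  # reuse the positional parameters to count whitespace-separated words
  set -- $1
  echo $#
}

words=$(count_words "one two three")
echo "words = $words"   # prints "words = 3"
```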


Ok, cool. I’m selling snake oil anyway. C is the one true language :slight_smile:


I definitely agree that my 500+ line bash script may be a candidate for a “real programming language”, but since it is essentially still just a wrapper for rclone, I think it’s ok to keep it in bash…

Despite its length, the usual symptoms of a bloated bash script are not present (nested loops, arrays, lots of functions, etc…). And it should run on pretty much anything with bash 4+ (even macOS after upgrading bash with homebrew), which is the reason I went with bash in the first place. rclone is very portable and I wanted my script to be as well.
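A sketch of a guard a bash-4+ script like that might start with, failing fast on bash 3 (e.g. the stock macOS /bin/bash) — just an illustration, not a quote from the script:

```shell
#!/bin/bash
# BASH_VERSINFO[0] holds the major version of the running bash
if (( BASH_VERSINFO[0] < 4 )); then
  echo "this script requires bash 4 or newer" 1>&2
  exit 1
fi
echo "bash ${BASH_VERSION} is new enough"
```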


Dash to Panel is making my desktop freeze every time I add an application to the favorites bar

can’t figure out what the issue is tbh

was on fedora 28 :sweat_smile: