
Remote Workstation with x2go and Manjaro Linux

This post may contain affiliate links. Please see the disclaimer for more information.

I’ve been running a very under-powered and increasingly ancient laptop, a ThinkPad X131e, for several years. It’s been upgraded over time with an SSD and a replacement battery. For my requirements this is mostly fine, since my workload consists largely of a web browser and terminal windows, and it runs my preferred desktop (KDE Plasma) perfectly well. Since my server performs the majority of my computing I haven’t really been limited by this. However, every now and then I need to run a couple of apps which just bring it to a halt as it thrashes around trying to swap. Clearly 4GB of RAM is not enough to run modern applications.

I actually do have another desktop computer, but my use of it recently has been limited to the odd DVD rip. This was mainly due to having to switch the mouse, keyboard and monitor over from my work machine in order to use it in the limited space which is my home office. I could have probably remedied this with a KVM switch, but somehow I never got around to it. I was also not enthused about spending my evenings sitting in the same office I’ve just been working in all day. So that computer sat gathering dust for the most part.

That was until I came across a post about using x2go to create a remote workstation after musing whether this would be possible. I decided to do the same, since I had the hardware already available to make it work.

A (Semi) Failed Experiment

Initially I did a few tests with local VMs to see if x2go was going to work with KDE, since I’d seen mixed reports about this. The good news is that it works pretty well (at least for basic remote desktop, I’ll come to some of the problems below). The bad news is that my preferred desktop distro – KDE Neon – didn’t work well. First of all I couldn’t install the client on my laptop due to a dependency issue in APT. Secondly, although the desktop worked fine I was unable to suspend the session due to some Systemd/D-BUS issue. So I tried another distro. I’d heard good things about Manjaro and the KDE edition works great with x2go (both client and server). It also has a really nice default theme!

I also wanted the remote desktop to run as a VM on the machine, under Proxmox, so that I could potentially switch distros easily or create extra VMs for other purposes. I spent quite a bit of time configuring this, only to find a few issues. The first was that I couldn’t get the host machine to pass through the internal DVD drive to the VM, which was a deal breaker. The second was that suspending the VM and shutting down the host was pretty clunky and prone to hang for no apparent reason. Since I had added this machine to a cluster with my other host, the cluster would also lose quorum when the host went down, which caused lots of things to fail, including VM backups on the remaining host.

Back to Bare Metal

I decided to abandon the VM approach for now and go with a bare metal install to see whether I could work with the remote desktop system. It wasn’t a complete loss, since I got a chance to try out the newly updated clustering in Proxmox, which will be relevant when I convert my existing Ubuntu server over.

The bare metal install of Manjaro was pretty boring (which is a good thing; installing a Linux distro should be boring and stable!). One thing I noticed was that I wasn’t able to manually set up LVM from the GUI installer. I could create volume groups, but it wouldn’t let me add partitions to them! As far as I understand, Manjaro Architect lets you do this. I played with it a bit later when installing Manjaro on my laptop, but opted to go with the default encrypted system option from the GUI installer. If I ever come to reinstall on the desktop machine, I’ll look at Manjaro Architect further.

Saving Power

Since this machine won’t be in use all the time, I wanted to shut it down and power it up remotely. I also wanted my x2go session to be persistent so I could pick up where I left off. For this reason I opted to use hibernate coupled with wake-on-LAN.

Configuring wake-on-LAN took a little while. Even after enabling it in the BIOS and configuring it in NetworkManager, it still didn’t work. It turned out that it was being disabled by TLP. After fixing that it worked fine.
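For anyone hitting the same problem, the checks and the TLP change boil down to something like the sketch below. The interface name enp3s0 is just a placeholder, and on older TLP versions the setting lives in /etc/default/tlp rather than /etc/tlp.conf.

$ ethtool enp3s0 | grep Wake-on      # "d" means WOL is disabled, "g" means wake on magic packet

# Stop TLP disabling WOL, then restart it
$ sudo sed -i 's/^#\?WOL_DISABLE=.*/WOL_DISABLE=N/' /etc/tlp.conf
$ sudo systemctl restart tlp.service

# The NIC should now report "Wake-on: g"
$ ethtool enp3s0 | grep Wake-on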

The next problem was that I wanted to power the system up and down via a switch in Home Assistant. This was difficult as the Home Automation system is on a different subnet to the machine in question. I opted in the end to run the WOL command from my pfSense firewall over SSH. The following HASS configuration gave me the switch I was looking for:

shell_command:
  # The firewall's forced command prepends "wol", so only the arguments are passed here
  desktop_power_on: "ssh -i /config/id_rsa -o 'StrictHostKeyChecking=no' home-assistant@<my_firewall> -- '-i <subnet broadcast address> <mac address>'"
  # The desktop's forced command runs "systemctl hibernate", so no command needs to be sent at all
  desktop_power_off: "ssh -i /config/id_rsa -o 'StrictHostKeyChecking=no' <user>@<my_desktop>"

binary_sensor:
  - platform: ping
    name: "Desktop Computer State"
    host: <desktop ip>
    count: 1
    scan_interval: 5

switch:
  - platform: template
    switches:
      desktop_computer_power:
        value_template: "{{ is_state('binary_sensor.desktop_computer_state', 'on') }}"
        turn_on:
          service: shell_command.desktop_power_on
        turn_off:
          service: shell_command.desktop_power_off

This requires a bit of setup on both the firewall and the desktop machine. First I created a user in pfSense with the relevant permissions to log in via SSH. I then added the following to the authorised SSH keys field:

command="wol $SSH_ORIGINAL_COMMAND",no-port-forwarding,no-x11-forwarding,no-agent-forwarding ssh-rsa .....

This basically allows the SSH key to only run the wol command and to pass through the original command as arguments to it. You’ll note that in the power on command above, only the arguments are specified. This means that I can send WOL packets to any machine on the network, but the key can’t do anything else.
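As a quick sanity check, the key can be exercised directly from the Home Assistant host; the broadcast address and MAC below are just placeholders:

$ ssh -i /config/id_rsa home-assistant@<my_firewall> -- '-i 192.168.1.255 aa:bb:cc:dd:ee:ff'

On the firewall the forced command turns this into wol -i 192.168.1.255 aa:bb:cc:dd:ee:ff, and anything else the key tries to run never makes it past that command.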

There is a similar bit of configuration on the desktop machine:

command="systemctl hibernate",no-port-forwarding,no-x11-forwarding,no-agent-forwarding ssh-rsa ....

This restricts the key to only running the systemctl hibernate command.

With that in place I have a nice switch in my HASS GUI to power up and down the machine. I can also automate it to power up and down under certain conditions if I wish.
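As an example of the kind of automation I mean, a minimal sketch like the following would power the desktop down overnight if it’s still on (the trigger time is arbitrary; the entity names match the configuration above):

automation:
  - alias: "Power down desktop overnight"
    trigger:
      - platform: time
        at: "01:00:00"
    condition:
      - condition: state
        entity_id: switch.desktop_computer_power
        state: "on"
    action:
      - service: switch.turn_off
        entity_id: switch.desktop_computer_power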

The resulting power switch in HASS

Setting Up x2go

Since Manjaro is based on Arch Linux, I just installed the x2goserver package on my desktop and the x2goclient package on my laptop.
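For reference, that amounts to something like the following; depending on how the package is set up you may also need to enable the x2go server service:

# On the desktop (server)
$ sudo pacman -S x2goserver
$ sudo systemctl enable --now x2goserver.service   # if it isn't started automatically

# On the laptop (client)
$ sudo pacman -S x2goclient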

x2go is fairly trivial to set up, requiring only that X11 forwarding be enabled in the SSH daemon on the server side; just follow the instructions to do this.
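For completeness, the server-side change is just this:

# /etc/ssh/sshd_config on the desktop
X11Forwarding yes

# Then restart the SSH daemon
$ sudo systemctl restart sshd.service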

My x2go Session Preferences

In order to connect to a KDE desktop running on the server, we need to set up a profile in the x2go client. The main thing here is to set the session type to “Custom desktop” with the command startplasma-x11 (the KDE session type doesn’t work for some reason). Obviously you also need to set the address of the machine to log in to. I also found that I needed to set the path to my SSH key in order to have it be used by the client.

The Finished Product

The final product is pretty nice. I can remotely boot my desktop machine from my laptop or phone. I then connect with x2go from my laptop and get to work! My previous session will be restored if there is one, meaning I can just pick up where I left off. The connection to the remote system is excellent, with no noticeable delay. So far I’ve only tested this over my local wifi, but that will be 97% of its usage anyway. There is some tearing when moving windows and some graphical issues when the x2go session window gets resized, but these resolve themselves after a few seconds.

My full remote desktop in all its glory!

One wrinkle is shared folders, which are supposed to be supported by x2go, but currently seem to be broken. I get an error message similar to that described here when I try to mount one. Apparently that bug is fixed, but I guess the version containing the fix hasn’t been released yet. If the new version doesn’t land soon I’ll probably try to work around it with SSHFS plus some scripting. For now it’s not too much of an issue, since all my files are on the desktop anyway!
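The workaround I have in mind is nothing fancy; a rough sketch (with placeholder paths) would be:

# Mount my home directory on the desktop over SSH
$ mkdir -p ~/desktop-home
$ sshfs <user>@<my_desktop>:/home/<user> ~/desktop-home

# Unmount when finished
$ fusermount -u ~/desktop-home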

I also don’t see any sound devices on the remote system, but I haven’t tried playing sound to see if it’s working. I tend not to listen to music or watch movies on my laptop anyway and I can still do this locally if need be.

Conclusion

Overall, I’m pretty happy with this setup. It’s not perfect, but it is nice to have easier access to my other machine. I still have a lot of setup to do at both ends to make this work better and to make the remote machine feel like home, but it’s getting there.

I’m not sure if I’ll run this setup full time yet or just for certain applications. I’ve decided to upgrade the RAM in both systems as well as my server, so we’ll see how the balance of local vs remote falls out after doing that! I’m also eyeing a Pinebook Pro as a potential replacement laptop.

I really like Manjaro as a distro. It’s not quite as polished a KDE experience as KDE Neon, but it’s getting there. I like the Arch base – I’ve been a fan of Arch since my very first experience with it (CAUTION: very old post!), but I just couldn’t get on with it as a daily driver. Manjaro gives you a nice experience out of the box with the power to tweak as much as you like.

I’ll be sure to update on any further progress with fixing some of the issues I’ve encountered with this project. In the meantime, if anyone has solutions/workarounds to the shared folder issue, please leave a comment. Bye for now!

If you liked this post and want to see more, please consider subscribing to the mailing list (below) or the RSS feed. You can also follow me on Twitter. If you want to show your appreciation, feel free to buy me a coffee.

UALUG Fedora Article

Ahoy’hoy!

Anyone who follows my Identi.ca feed will probably be aware that I recently switched away from Ubuntu and Arch Linux and over to Fedora. This was mainly due to frustrations with Arch (I need a system which lets me get stuff done without too much overhead in looking after it) and my increasing feeling that Ubuntu isn’t going in the right direction for me as a power user/developer.

I recently wrote an article on Fedora 13 for our university LUG – UALUG – and I thought I’d post a link for readers of this blog. The article focuses on Fedora 13 as a platform for developers and basically details my own Fedora 13 setup. It’s written with the aim of advocating Fedora for people new to Linux, but it also serves as my review of the latest Fedora release.

Anyway here’s the link: https://ualug.ece.auckland.ac.nz/archives/246.

Hopefully I should be posting some more interesting content here soon as I’ve been playing around with some interesting stuff. It’s just a matter of me finding time to write it up! Bye for now.

Installing and Configuring Arch Linux: Part 1

OTHERWISE ENTITLED: Rob tries to install Arch Linux some of the time, but really spends most of the time drinking beer.

Before I start: NO, UNLIKE EVERY OTHER ARTICLE PUBLISHED ON THE WEB TODAY, THIS IS NOT A JOKE, K?!?

I’ve been looking for a new distro recently. I do this from time to time, principally because I get bored of what I’m currently running. Last time it was Crunchbang which I settled on. This time I wanted to go more advanced, so I started researching Arch Linux.

For those that don’t know, Arch Linux describes itself as:

…a lightweight and flexible Linux® distribution that tries to Keep It Simple.

I’d heard about Arch in the past from several sources and had heard that you basically have to install and configure everything yourself, but that the package manager (awesomely named Pacman!) manages software without having to compile from source (unless you want to!).
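To give a flavour of what that looks like, day-to-day package management is just a couple of commands (using vim as an arbitrary example):

$ pacman -Ss vim    # search the binary repositories
$ pacman -S vim     # install a package and its dependencies
$ pacman -Syu       # sync the package databases and upgrade the whole system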

The following series of posts will be a record of my experiences installing and configuring Arch on my home desktop machine. This isn’t intended to be an exhaustive installation guide, more just a record of where I tripped up in order to aid those who come next. If you are searching for an installation guide, try the excellent article on the Arch Wiki.

I’ve separated the post out into days. Note: it didn’t actually take me a full day for each part; I work during the day and only really had a couple of hours each evening to spend on this.

Day 1: Backing Up

Before installing I wanted to make sure I didn’t trash my existing Ubuntu system and all my personal data, as I still need to do all the stuff I usually do with my machine. So I made a backup.

I’m not really going to go into how. Suffice it to say I used LVM snapshots and rsync; I might write about this in a future post.

This took a while, as I have quite a lot of data. I thought it best to have a beer in the mean time, so I did.

Day 2: Making Space, Starting the Installation and Various Adventures with LVM

The next thing to do was to resize my existing LVM partition containing Ubuntu so that I had space for Arch. I couldn’t work out how to do this at first, as none of the partition tools I tried (GParted and cfdisk) could resize the partition. I eventually worked out how to do it.

First, on my running Ubuntu system I resized the physical volume with:

$ pvresize --setphysicalvolumesize 500G /dev/sda1

This shrank the space used by LVM down to 500GB (from about 1000GB on my machine).

I then rebooted into the Arch live CD (64-bit edition in my case), and ran:

$ fdisk /dev/sda

Now what you have to do next is slightly alarming. You actually have to delete the partition and recreate it at the new size. This works without destroying your data because fdisk only manipulates the partition table on the disk; it doesn’t format the partitions themselves.
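Roughly, the fdisk session goes like this. This is a sketch rather than a transcript, so double-check the start sector and the partition type before writing anything:

Command (m for help): p    # print the table and note the partition's start sector/cylinder
Command (m for help): d    # delete the partition (the data on it is untouched)
Command (m for help): n    # recreate it with the same start and the new, smaller size
Command (m for help): t    # set the type back to 8e (Linux LVM) if needed
Command (m for help): w    # write the new table and exit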

I did this through fdisk so that the partition was 501GB (making it a little bigger than the PV just to make sure). I then rebooted back into Ubuntu and ran:

$ pvresize /dev/sda1

to allow it to use all the space. This probably isn’t necessary, but I wanted to be safe.

Next, I proceeded to the installation. For some reason the Arch boot CD was really slow to boot and gave me loads of read errors; I think this might have something to do with my drive, as I’ve been experiencing the same with other disks. Eventually it booted and dropped me at the default prompt.

From there I basically followed the installation guide for setting up the package source (CD) and the date and time.

I then set about partitioning the disks. The Arch installer uses cfdisk, which is fine. I just added two partitions to my disk: a small (255MB) one for my /boot partition and a large LVM one for the rest of the system (I like LVM and wanted to use it again on Arch).

This was fine, but I had some problems setting up the LVM through the installer, even though the user guide seems to think it can do it. Every time I tried, it would just fail when creating the volume group, which was weird.

I gave up for the evening and (you guessed it) went for a beer!

Day 3: Successful Installation

The next day I thought I’d try googling for LVM on Arch. Luckily, by the time I got in to work, @duffkitty on identi.ca had seen one of my posts complaining about the problems and had given me a link to the LVM article on the Arch Wiki.

This advocates setting up LVM manually (and guides you through it) and then just selecting the resulting partitions in the installer. It also gives you some important things to look out for when configuring the system. Following these instructions worked like a charm and I was able to format everything correctly and install the base system.
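The gist of the manual setup, assuming /dev/sda2 is the big LVM partition, is something like the following (volume names, sizes and filesystems are just examples, not exactly what I used):

$ pvcreate /dev/sda2                 # initialise the partition as an LVM physical volume
$ vgcreate arch /dev/sda2            # create a volume group on it
$ lvcreate -L 20G -n root arch       # a logical volume for /
$ lvcreate -L 2G -n swap arch        # ...and one for swap
$ lvcreate -l 100%FREE -n home arch  # give /home whatever is left
$ mkfs.ext4 /dev/arch/root
$ mkfs.ext4 /dev/arch/home
$ mkswap /dev/arch/swap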

I then moved on to configuring the system, following the install guide and taking into account the instructions in the LVM article. Everything went pretty much fine here and I eventually got to installing the bootloader. Here I replaced the Ubuntu GRUB version with the one installed by Arch. This left me having to add an entry for Ubuntu, which wasn’t difficult; I just copied the Arch one and changed the partition and file names.
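For what it’s worth, the copied entry in /boot/grub/menu.lst ended up looking roughly like this. The kernel version, the (hd0,X) partition and the root device are placeholders; they need to point at wherever Ubuntu’s kernel and root filesystem actually live:

title  Ubuntu
root   (hd0,1)
kernel /vmlinuz-2.6.32-21-generic root=/dev/mapper/ubuntu-root ro quiet
initrd /initrd.img-2.6.32-21-generic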

Then it was time to ‘type reboot and pray’ as the Arch installation guide puts it.

So I did.

When I rebooted the bootloader came up with the Arch and Ubuntu entries. I selected Ubuntu just to check everything was OK.

It didn’t work.

Panicking and Swearing Ensued.

I rebooted and selected Arch.

That worked (thankfully).

When it had booted I logged in and opened up the GRUB config file again. It turned out I had mistyped the name of the Ubuntu initrd file, which was easily fixed. Rebooting got me safely back to Ubuntu.

So now I have a functioning dual boot between my original Ubuntu install and a very basic Arch install. I think I might need some software on there!

But first… beer.

So What’s Next???

Well, firstly I need to get my network connection up and running, as I didn’t do that during the install. It’s a Wi-Fi connection over WPA, so that’s going to be fun. Then I can start installing software. I’ll probably follow the Beginners’ Guide on the Wiki (from Part 3). I was also recommended Yaourt by @duffkitty, so I’ll give that a try.
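I haven’t set the Wi-Fi up yet, so this is only a guess, but from the console it’ll probably look something like this (SSID, passphrase and interface name are placeholders):

$ wpa_passphrase MySSID MyPassphrase > /etc/wpa_supplicant.conf
$ wpa_supplicant -B -i wlan0 -c /etc/wpa_supplicant.conf
$ dhcpcd wlan0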

I’ll be continuing to play with Arch over the next few days and reporting my progress in follow-up posts here. I’ll also be denting as I go along, and you can follow all of these on my #archrob hashtag.

There’ll probably be beer too.

We’ll see how it goes, but eventually I hope to have a system I can use full time.

Bye for now! Happy Easter!