(shadow 'char-width)
(use-package :xlib)

(defparameter *mods* '(:mod-1))
(defparameter *move* 1)
(defparameter *resize* 3)
(defparameter *lower* 4)
(defparameter *raise* 5)
(defparameter *display* nil) ; set this to an integer to do testing with xnest
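For anyone wondering how those parameters get used: a TinyWM-style manager grabs each configured button on the root window, qualified by *mods*, so that mod-1 + drag is delivered to the window manager instead of the client. The sketch below is a guess at that wiring rather than the rest of the original file; it uses only standard CLX calls (plus the (use-package :xlib) above) and falls back to an explicit display number, e.g. a running Xnest, when *display* is set.

(defun init-wm ()
  (let* ((dpy (if *display*
                  (open-display "" :display *display*) ; e.g. Xnest on :1
                  (open-default-display)))
         (root (screen-root (display-default-screen dpy))))
    ;; Report mod-1 + button presses for move/resize/lower/raise to us
    ;; rather than to the client windows.
    (dolist (button (list *move* *resize* *lower* *raise*))
      (grab-button root button '(:button-press)
                   :modifiers (apply #'make-state-mask *mods*)))
    dpy))

OPEN-DEFAULT-DISPLAY itself comes straight from CLX: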
(defun open-default-display (&optional display-name)
  "Open a connection to DISPLAY-NAME if supplied, or to the appropriate
default display as given by GET-DEFAULT-DISPLAY otherwise.

OPEN-DEFAULT-DISPLAY always attempts to do display authorization.  The
hostname is resolved to an address, then authorization data for the
(protocol, host-address, displaynumber) triple is looked up in the file
given by AUTHORITY-PATHNAME (typically $HOME/.Xauthority).  If the
protocol is :local, or if the hostname resolves to the local host,
authority data for the local machine's actual hostname - as returned by
gethostname(3) - is used instead."
  (destructuring-bind (host display screen protocol)
      (get-default-display display-name)
    (declare (ignore screen))
    (open-display host :display display :protocol protocol)))
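Usage is as simple as the docstring suggests. A quick sketch using only standard CLX calls, e.g. to connect via $DISPLAY and print the screen size:

(let ((dpy (xlib:open-default-display)))
  (unwind-protect
       (let ((screen (xlib:display-default-screen dpy)))
         (format t "~Dx~D~%"
                 (xlib:screen-width screen)
                 (xlib:screen-height screen)))
    (xlib:close-display dpy)))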
>>1 Looks cool, but X sucks. Fortunately, progress in recent years has almost made X optional; the holdout is that AMD, nVidia, and Intel are stuck in the stone age with their binary-blob drivers, which still require X and would probably cost each of them a couple million dollars in programmer salaries to migrate away from. Eventually it'll happen, but not yet.
All of the cool kids only care about OpenGL ES 2.0 (and soon ES 3.0, which is in testing), and you can get decent performance with the open source drivers based around libGL-mesa. For fullscreen graphics that works from a terminal without a window manager, use udev + libdrm + EGL + OpenGL ES 2.0 directly. For windowed graphics, build and install Wayland and then use Wayland + EGL + OpenGL ES 2.0.
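If anyone wants to poke at that stack from Lisp rather than C, the EGL end of it is reachable through CFFI. A minimal sketch, assuming libEGL.so is installed and CFFI is loaded; eglGetDisplay and eglInitialize are the real C entry points, but the Lisp binding names are hand-rolled here for illustration, and a real program would go on to choose an EGLConfig and create a GLES 2.0 context:

(cffi:define-foreign-library libegl
  (t (:default "libEGL")))
(cffi:use-foreign-library libegl)

(cffi:defcfun ("eglGetDisplay" egl-get-display) :pointer
  (native-display :pointer))
(cffi:defcfun ("eglInitialize" egl-initialize) :int
  (display :pointer) (major :pointer) (minor :pointer))

;; EGL_DEFAULT_DISPLAY is the null pointer in the C headers.
(let ((dpy (egl-get-display (cffi:null-pointer))))
  (cffi:with-foreign-objects ((major :int) (minor :int))
    (when (zerop (egl-initialize dpy major minor)) ; EGL_FALSE = 0
      (error "eglInitialize failed"))
    (format t "EGL ~D.~D~%"
            (cffi:mem-ref major :int)
            (cffi:mem-ref minor :int))))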