
Spotifyd


An open source Spotify client running as a UNIX daemon.

Spotifyd streams music just like the official client, but is more lightweight and supports more platforms. Spotifyd also supports the Spotify Connect protocol, which makes it show up as a device that can be controlled from the official clients.

Note: Spotifyd requires a Spotify Premium account.

Installation

Provided binaries

We provide pre-built binaries through GitHub Actions for the most popular platforms: Linux, macOS and ARMv7. You can find them here. For integrity verification, each file's SHA-512 checksum is calculated and uploaded as well.
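
For example, you can verify a downloaded archive by computing its checksum locally and comparing it against the uploaded value (the file name below is only a placeholder; actual asset names vary per release):

sha512sum spotifyd-linux-slim.tar.gz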

The provided binaries come in two flavours, slim and full, each compiled with a different set of features. slim only contains the platform's most commonly used audio backend, while full additionally has all optional features enabled (see Feature Flags).

Compiling from source

You can also compile Spotifyd yourself, allowing you to make use of feature flags. Spotifyd is written in Rust. You can download the toolchain (compiler and package manager) over at rustup.rs. Follow their instructions to get started.

Note: Please make sure that you compile the package using the most recent stable version of Rust available through rustup. Some distro versions are quite outdated and might result in compilation errors.

Spotifyd might require additional libraries during build and runtime, depending on your platform and the way you compile it (statically or dynamically). The following table shows the libraries needed for each OS respectively.

Target Platform Libraries
Fedora alsa-lib-devel, make, gcc
openSUSE alsa-devel, make, gcc
Debian libasound2-dev, libssl-dev, libpulse-dev, libdbus-1-dev
macOS dbus, pkg-config, portaudio

Note: The package names for Linux are the ones used on Debian based distributions (like Ubuntu). You will need to adapt the packages for your distribution respectively.
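
On a Debian-based system, for example, the build dependencies from the table above could be installed with something like:

sudo apt install libasound2-dev libssl-dev libpulse-dev libdbus-1-dev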

To compile the binary, run

cargo build --release

To install the resulting binary, run

cargo install --path .

Building a Debian package

You can use the cargo-deb crate to build a Debian package from source. Install it with:

$ cargo install cargo-deb

Then you can build and install the Debian package with:

$ cargo deb --install

Note that when building a Debian package, the --release flag is already passed to the build command, so you do not need to specify it yourself. See Cargo.toml for the flags that are set by default.

Feature Flags

Spotifyd is split into a base package plus additional features that can be toggled on or off during compilation. These fall into two groups: audio backend features, which are responsible for music playback, and additional functionality features, which enhance your experience using spotifyd.

To enable an additional audio backend, pass <audio_backend_name>_backend as a feature flag. We currently support alsa, pulseaudio and portaudio.

Spotifyd provides the following additional functionality:

Feature Flag Description
dbus_keyring Provides password authentication over the system's keyring (supports all platforms)
dbus_mpris Provides multimedia key support (Linux only)

Note: Compiling Spotifyd with all features and the pulseaudio backend on Ubuntu would result in the following command: cargo build --release --no-default-features --features pulseaudio_backend,dbus_keyring,dbus_mpris

Media controls

Spotifyd implements the MPRIS D-Bus Interface Specification, meaning that it can be controlled by generic media playback controllers such as playerctl as well as some tools specifically designed for use with the official Spotify client such as sp.

Note: Make sure to rename the service name within the sp script to spotifyd!

Although the code has greatly improved, this feature is still considered experimental. Please open an issue if you encounter problems while using other players to control spotifyd.
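
For example, once spotifyd is running with the dbus_mpris feature enabled, a generic controller such as playerctl should be able to drive it roughly like this (the player name is assumed to be spotifyd here):

playerctl --player=spotifyd play-pause
playerctl --player=spotifyd next
playerctl --player=spotifyd previous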

Audio Backends

By default, the audio backend is ALSA, as ALSA is available by default on a lot of machines and usually doesn't require extra dependencies. There is also support for pulseaudio and portaudio.

Note: To disable the default ALSA backend, pass --no-default-features during compilation.

PulseAudio

To use PulseAudio, compile with the --features flag to enable it:

cargo build --release --features "pulseaudio_backend"

You will need the development package for PulseAudio, as well as build-essential or the equivalent package of your distribution.
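
On Debian or Ubuntu, for instance, that typically means installing:

sudo apt install libpulse-dev build-essential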

PortAudio

To use PortAudio (works on macOS), compile with the --features flag to enable it:

cargo build --release --no-default-features --features="portaudio_backend"

Note: It is important that you also pass down --no-default-features as macOS doesn't support the alsa_backend feature!

Rodio

To use Rodio (works on Windows, macOS and Linux), compile with the --features flag to enable it:

cargo build --release --no-default-features --features="rodio_backend"

On Linux, you will need the ALSA development package as well as make/gcc (libasound2-dev and build-essential on Debian; alsa-lib-devel, make and gcc on Fedora).

Configuration

Spotifyd is able to run without configuration at all and will assume default values for most of the fields. However, running without configuration will only allow you to connect to it via Spotify Connect if you're on the same network as the daemon.

CLI options

Spotifyd can be configured using CLI arguments. For a detailed description as well as possible values for each flag, run

spotifyd --help
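
As a rough sketch, a one-off foreground run might look like this (the flag names are assumed to mirror the configuration keys described below; consult spotifyd --help for the exact spelling in your version):

spotifyd --no-daemon --backend pulseaudio --device-name speaker_in_living_room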

Configuration file

Spotifyd is able to load configuration values from a file too. The file has to be named spotifyd.conf and reside in the user's configuration directory (~/.config/spotifyd) or the system configuration directory (/etc or /etc/xdg/spotifyd). This also applies to macOS!
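
For example, to create a per-user configuration on Linux:

mkdir -p ~/.config/spotifyd
touch ~/.config/spotifyd/spotifyd.conf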

The configuration file consists of two sections, global and spotifyd, with spotifyd taking priority over global.

The configuration file has the following format:

[global]
# Your Spotify account name.
username = username

# Your Spotify account password.
password = password

# A command that gets executed and can be used to
# retrieve your password.
# The command should return the password on stdout.
#
# This is an alternative to the `password` field. Both
# can't be used simultaneously.
password_cmd = command_that_writes_password_to_stdout

# If set to true, `spotifyd` tries to look up your
# password in the system's password storage.
#
# This is an alternative to the `password` field. Both
# can't be used simultaneously.
use_keyring = true

# The audio backend used to play your music. To get
# a list of possible backends, run `spotifyd --help`.
backend = alsa

# The alsa audio device to stream audio to. To get a
# list of valid devices, run `aplay -L`.
device = alsa_audio_device  # omit for macOS

# The alsa control device. By default this is the same
# name as the `device` field.
control = alsa_audio_device  # omit for macOS

# The alsa mixer used by `spotifyd`.
mixer = PCM

# The volume controller. Each one behaves differently
# on volume changes. For possible values, run
# `spotifyd --help`.
volume_controller = alsa  # use softvol for macOS

# A command that gets executed in your shell whenever the song changes.
on_song_change_hook = command_to_run_on_playback_events

# The name that gets displayed under the connect tab on
# official clients. Spaces are not allowed!
device_name = device_name_in_spotify_connect

# The audio bitrate. 96, 160 or 320 kbit/s
bitrate = 160

# The directory used to cache audio data. This setting can save
# a lot of bandwidth when activated, as it will avoid re-downloading
# audio files when replaying them.
#
# Note: The file path does not get expanded. Environment variables and
# shell placeholders like $HOME or ~ don't work!
cache_path = cache_directory

# If set to true, audio data does NOT get cached.
no_audio_cache = true

# If set to true, enables volume normalisation between songs.
volume_normalisation = true

# The normalisation pregain that is applied for each song.
normalisation_pregain = -10

# The port `spotifyd` uses to announce its service over the network.
zeroconf_port = 1234

# The proxy `spotifyd` will use to connect to spotify.
proxy = http://proxy.example.org:8080

# The displayed device type in Spotify clients.
# Can be unknown, computer, tablet, smartphone, speaker, tv,
# avr (Audio/Video Receiver), stb (Set-Top Box), and audiodongle.
device_type = speaker

Alternatives to storing your password in the config file

  • password_cmd config entry

    This feature allows you to provide a command that prints your password to stdout, which saves you from having to store your password in the config file directly. To use it, set the password_cmd config entry to the command you would like to use and remove the password config entry.

    For example, using the password-management utility pass:

    # ~/.config/spotifyd/spotifyd.conf
    password_cmd = pass spotify
  • use_keyring config entry / --use-keyring CLI flag

    This feature leverages Linux's DBus Secret Service API or the native macOS keychain in order to forgo the need to store your password directly in the config file. To use it, compile with the dbus_keyring feature and either set the use-keyring config entry to true or pass the --use-keyring CLI flag to the daemon during start. Remove the password and/or password_cmd config entries.

    Your keyring entry needs to have the following attributes set:

    application: rust-keyring
    service: spotifyd
    username: <your-spotify-username>
    

    To add such an entry into your keyring, you can use secret-tool, a CLI used to communicate with agents that support the Secret Service API:

    secret-tool store --label='name you choose' application rust-keyring service spotifyd username <your-username>

    On macOS, you can either use the keychain GUI to add such an item, or use the built-in security tool:

    security add-generic-password -s spotifyd -D rust-keyring -a <your username> -w

Shell used to run commands indicated by password_cmd or on_song_change_hook

If either of these options is given, the shell spotifyd will use to run its commands is the shell indicated by the SHELL environment variable, if set. If the SHELL environment variable is not set, spotifyd will use the user's default shell, which, on Linux and BSD, is the shell listed in /etc/passwd. On macOS it is the shell listed in the output of dscl . -read /Users/<username> UserShell.
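
As an illustration, a hypothetical hook script (the path and the use of notify-send are just examples) could look like this and be referenced by its absolute path in the on_song_change_hook config entry:

#!/bin/sh
# /home/you/bin/spotifyd-hook.sh (hypothetical): show a desktop notification on playback events
notify-send "spotifyd" "playback event"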

Running as a system service

on Linux

A systemd.service unit file is provided to help run spotifyd as a service on systemd-based systems. The file contrib/spotifyd.service should be copied to either:

/etc/systemd/user/
~/.config/systemd/user/

Packagers of systemd-based distributions are encouraged to include the file in the former location. End users should prefer the latter. It should be noted that some targets are not available when running under the user directory, such as network-online.target.

Control of the daemon is then handed over to systemd. The following example commands will, respectively, run the service once and enable it to always run on login:

systemctl --user start spotifyd.service
systemctl --user enable spotifyd.service
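
To check that the service is running and to inspect its output, the usual systemd tooling applies:

systemctl --user status spotifyd.service
journalctl --user -u spotifyd.service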

on macOS

On macOS the system wide and per-user daemon/agent manager is known as launchd. Interfacing with launchd is performed through launchctl.

In order to use spotifyd as a service on macOS one must specify a .plist that represents the service, and place it in /Library/LaunchDaemons.

Here is a .plist which works with macOS Catalina 10.15.3:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
	<dict>
		<key>Label</key>
		<string>rustlang.spotifyd</string>
		<key>ProgramArguments</key>
		<array>
			<string>/usr/local/bin/spotifyd</string>
			<string>--config-path=/users/YourUserName/.config/spotifyd/spotifyd.conf</string>
			<string>--no-daemon</string>
		</array>
		<key>UserName</key>
		<string>YourUserName</string>
		<key>KeepAlive</key>
		<true/>
		<key>ThrottleInterval</key>
		<integer>30</integer>
	</dict>
</plist>

Once present in the /Library/LaunchDaemons directory, the .plist must be loaded and started with the following commands.

sudo launchctl load -w /Library/LaunchDaemons/rustlang.spotifyd.plist

sudo launchctl start /Library/LaunchDaemons/rustlang.spotifyd.plist

One may also unload/stop the service in a similar fashion replacing load/start with unload/stop.
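
Following the same pattern, stopping and unloading would look like:

sudo launchctl stop /Library/LaunchDaemons/rustlang.spotifyd.plist
sudo launchctl unload -w /Library/LaunchDaemons/rustlang.spotifyd.plist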

Note:

  • You should replace "YourUserName" with your actual macOS username (or remove the "UserName" key to run as root).

  • The --no-daemon argument is needed because otherwise launchd won't receive a PID for the process and will lose its remit over spotifyd. It's best to include it; there will be no difference in use, nor will you see any log output.

  • macOS tries to start the daemon immediately on boot, and spotifyd fails if Wi-Fi isn't connected yet. The keep-alive therefore retries if the launch fails on boot, with a 30-second throttle interval, which is usually enough time for Wi-Fi to come up.

Common issues

  • Spotifyd will not work without Spotify Premium
  • The device name cannot contain spaces

Contributing

We always appreciate help during the development of spotifyd! If you are new to programming, open source or Rust in general, take a look at issues tagged with good first issue. These normally are easy to resolve and don't take much time to implement.

Credits

This project would not have been possible without the amazing reverse engineering work done in librespot, mostly by plietar.

Comments

  • spotifyd exits when other user in WiFi selects it as output device in spotify app

    Jan 12, 2022

    Description: My instance of spotifyd appears as a selectable device in the Spotify Android app for other users in the same WiFi (not logged into my Spotify account!). When selected as the playback device, spotifyd on my machine terminates. Selecting it as the playback device from my phone (where the same Spotify account is logged in) works fine.

    To Reproduce

    1. Need 2 different spotify accounts, referred to below as account A and account B.
    2. Start spotifyd on one machine, setup with spotify account A.
    3. Open the spotify app on an android phone in the same WiFi, setup with spotify account B.
    4. When playing a song on the phone, the spotifyd instance can be selected in the "devices" overview.
    5. When selecting it as playback device on the phone, spotifyd on the first machine exits.

    Expected behavior: I would expect spotifyd to just reject the request, certainly not crash.

    Logs

    Loading config from "/home/achim/.config/spotifyd/spotifyd.conf"
    CliConfig { config_path: None, no_daemon: true, verbose: true, pid: None, shared_config: SharedConfigValues { username: Some("taken out for privacy"), username_cmd: None, password: None, password_cmd: None, use_keyring: true, use_mpris: None, on_song_change_hook: None, cache_path: None, no-audio-cache: false, backend: Some(PulseAudio), volume_controller: None, device: None, control: None, mixer: None, device_name: None, bitrate: None, initial_volume: None, volume_normalisation: false, normalisation_pregain: None, zeroconf_port: None, proxy: None, device_type: Some(Computer) } }
    Found user shell: Some("/bin/bash")
    No password specified. Checking password_cmd
    No password_cmd specified
    No proxy specified
    registering event source with poller: token=Token(0), interests=READABLE | WRITABLE
    Using software volume controller.
    registering event source with poller: token=Token(1), interests=READABLE | WRITABLE
    Zeroconf server listening on 0.0.0.0:37717
    registering event source with poller: token=Token(2), interests=READABLE | WRITABLE
    registering event source with poller: token=Token(3), interests=READABLE | WRITABLE
    Checking keyring for password
    found interface Interface { name: "wlp3s0", addr: V4(Ifv4Addr { ip: 192.168.0.8, netmask: 255.255.255.0, broadcast: Some(192.168.0.255) }) }
    found interface Interface { name: "br-b2178398e9dc", addr: V4(Ifv4Addr { ip: 172.21.0.1, netmask: 255.255.0.0, broadcast: Some(172.21.255.255) }) }
    found interface Interface { name: "docker0", addr: V4(Ifv4Addr { ip: 172.17.0.1, netmask: 255.255.0.0, broadcast: Some(172.17.255.255) }) }
    found interface Interface { name: "tun0", addr: V4(Ifv4Addr { ip: 10.8.0.14, netmask: 255.255.255.255, broadcast: None }) }
    found interface Interface { name: "wlp3s0", addr: V6(Ifv6Addr { ip: 2a02:908:d81:e040:aeed:5cff:fe71:93ff, netmask: ffff:ffff:ffff:ffff::, broadcast: None }) }
    sending packet to 224.0.0.251:5353
    found interface Interface { name: "wlp3s0", addr: V4(Ifv4Addr { ip: 192.168.0.8, netmask: 255.255.255.0, broadcast: Some(192.168.0.255) }) }
    found interface Interface { name: "br-b2178398e9dc", addr: V4(Ifv4Addr { ip: 172.21.0.1, netmask: 255.255.0.0, broadcast: Some(172.21.255.255) }) }
    found interface Interface { name: "docker0", addr: V4(Ifv4Addr { ip: 172.17.0.1, netmask: 255.255.0.0, broadcast: Some(172.17.255.255) }) }
    found interface Interface { name: "tun0", addr: V4(Ifv4Addr { ip: 10.8.0.14, netmask: 255.255.255.255, broadcast: None }) }
    found interface Interface { name: "wlp3s0", addr: V6(Ifv6Addr { ip: 2a02:908:d81:e040:aeed:5cff:fe71:93ff, netmask: ffff:ffff:ffff:ffff::, broadcast: None }) }
    sending packet to [ff02::fb]:5353
    received packet from 192.168.0.8:5353
    received packet from 192.168.0.8:5353 with no query
    received packet from [fe80::aeed:5cff:fe71:93ff%3]:5353
    received packet from [fe80::aeed:5cff:fe71:93ff%3]:5353 with no query
    Connecting to AP "ap.spotify.com:443"
    registering event source with poller: token=Token(4), interests=READABLE | WRITABLE
    flushing framed transport
    writing; remaining=161
    framed transport flushed
    attempting to decode a frame
    frame decoded from buffer
    Authenticated as "zexp2flxvsi8valjmjqmlazg3" !
    new Session[0]
    flushing framed transport
    framed transport flushed
    attempting to decode a frame
    frame decoded from buffer
    Session[0] strong=4 weak=1
    attempting to decode a frame
    frame decoded from buffer
    attempting to decode a frame
    frame decoded from buffer
    attempting to decode a frame
    frame decoded from buffer
    Country: "DE"
    attempting to decode a frame
    new Player[0]
    Using PulseAudio sink with format: S16
    new Spirc[0]
    canonical_username: zexp2flxvsi8valjmjqmlazg3
    new MercuryManager
    frame decoded from buffer
    attempting to decode a frame
    frame decoded from buffer
    attempting to decode a frame
    frame decoded from buffer
    attempting to decode a frame
    input volume:65535 to mixer: 65535
    flushing framed transport
    writing; remaining=83
    framed transport flushed
    flushing framed transport
    writing; remaining=378
    framed transport flushed
    command=AddEventSender
    command=VolumeSet(65535)
    flushing framed transport
    writing; remaining=438
    framed transport flushed
    flushing framed transport
    framed transport flushed
    attempting to decode a frame
    frame decoded from buffer
    attempting to decode a frame
    flushing framed transport
    framed transport flushed
    attempting to decode a frame
    frame decoded from buffer
    attempting to decode a frame
    subscribed uri=hm://remote/user/zexp2flxvsi8valjmjqmlazg3/ count=0
    flushing framed transport
    framed transport flushed
    attempting to decode a frame
    frame decoded from buffer
    attempting to decode a frame
    flushing framed transport
    framed transport flushed
    attempting to decode a frame
    attempting to decode a frame
    frame decoded from buffer
    attempting to decode a frame
    registering event source with poller: token=Token(5), interests=READABLE
    received packet from 192.168.0.244:5353
    received question: IN {"nm":"Redmi 9","as":"[8193, 8194]","ip":"244"}._mi-connect._udp.local
    received packet from [fe80::acc3:1c85:eb00:de1e%3]:5353
    received question: IN {"nm":"Redmi 9","as":"[8193, 8194]","ip":"244"}._mi-connect._udp.local
    received packet from 192.168.0.244:5353
    received question: IN {"nm":"Redmi 9","as":"[8193, 8194]","ip":"244"}._mi-connect._udp.local
    received packet from [fe80::acc3:1c85:eb00:de1e%3]:5353
    received question: IN {"nm":"Redmi 9","as":"[8193, 8194]","ip":"244"}._mi-connect._udp.local
    received packet from 192.168.0.244:5353
    received question: IN {"nm":"Redmi 9","as":"[8193, 8194]","ip":"244"}._mi-connect._udp.local
    received packet from [fe80::acc3:1c85:eb00:de1e%3]:5353
    received question: IN {"nm":"Redmi 9","as":"[8193, 8194]","ip":"244"}._mi-connect._udp.local
    received packet from 192.168.0.244:5353
    received packet from 192.168.0.244:5353 with no query
    received packet from [fe80::acc3:1c85:eb00:de1e%3]:5353
    received packet from [fe80::acc3:1c85:eb00:de1e%3]:5353 with no query
    received packet from 192.168.0.244:5353
    received packet from 192.168.0.244:5353 with no query
    received packet from [fe80::acc3:1c85:eb00:de1e%3]:5353
    received packet from [fe80::acc3:1c85:eb00:de1e%3]:5353 with no query
    received packet from 192.168.0.244:5353
    received question: IN _spotify-connect._tcp.local
    found interface Interface { name: "wlp3s0", addr: V4(Ifv4Addr { ip: 192.168.0.8, netmask: 255.255.255.0, broadcast: Some(192.168.0.255) }) }
    found interface Interface { name: "br-b2178398e9dc", addr: V4(Ifv4Addr { ip: 172.21.0.1, netmask: 255.255.0.0, broadcast: Some(172.21.255.255) }) }
    found interface Interface { name: "docker0", addr: V4(Ifv4Addr { ip: 172.17.0.1, netmask: 255.255.0.0, broadcast: Some(172.17.255.255) }) }
    found interface Interface { name: "tun0", addr: V4(Ifv4Addr { ip: 10.8.0.14, netmask: 255.255.255.255, broadcast: None }) }
    found interface Interface { name: "wlp3s0", addr: V6(Ifv6Addr { ip: 2a02:908:d81:e040:aeed:5cff:fe71:93ff, netmask: ffff:ffff:ffff:ffff::, broadcast: None }) }
    sending packet to 224.0.0.251:5353
    received packet from [fe80::acc3:1c85:eb00:de1e%3]:5353
    received question: IN _spotify-connect._tcp.local
    found interface Interface { name: "wlp3s0", addr: V4(Ifv4Addr { ip: 192.168.0.8, netmask: 255.255.255.0, broadcast: Some(192.168.0.255) }) }
    found interface Interface { name: "br-b2178398e9dc", addr: V4(Ifv4Addr { ip: 172.21.0.1, netmask: 255.255.0.0, broadcast: Some(172.21.255.255) }) }
    found interface Interface { name: "docker0", addr: V4(Ifv4Addr { ip: 172.17.0.1, netmask: 255.255.0.0, broadcast: Some(172.17.255.255) }) }
    found interface Interface { name: "tun0", addr: V4(Ifv4Addr { ip: 10.8.0.14, netmask: 255.255.255.255, broadcast: None }) }
    found interface Interface { name: "wlp3s0", addr: V6(Ifv6Addr { ip: 2a02:908:d81:e040:aeed:5cff:fe71:93ff, netmask: ffff:ffff:ffff:ffff::, broadcast: None }) }
    sending packet to [ff02::fb]:5353
    received packet from 192.168.0.8:5353
    received packet from 192.168.0.8:5353 with no query
    received packet from [fe80::aeed:5cff:fe71:93ff%3]:5353
    received packet from [fe80::aeed:5cff:fe71:93ff%3]:5353 with no query
    received packet from 192.168.0.244:5353
    received question: IN _spotify-connect._tcp.local
    found interface Interface { name: "wlp3s0", addr: V4(Ifv4Addr { ip: 192.168.0.8, netmask: 255.255.255.0, broadcast: Some(192.168.0.255) }) }
    found interface Interface { name: "br-b2178398e9dc", addr: V4(Ifv4Addr { ip: 172.21.0.1, netmask: 255.255.0.0, broadcast: Some(172.21.255.255) }) }
    found interface Interface { name: "docker0", addr: V4(Ifv4Addr { ip: 172.17.0.1, netmask: 255.255.0.0, broadcast: Some(172.17.255.255) }) }
    found interface Interface { name: "tun0", addr: V4(Ifv4Addr { ip: 10.8.0.14, netmask: 255.255.255.255, broadcast: None }) }
    found interface Interface { name: "wlp3s0", addr: V6(Ifv6Addr { ip: 2a02:908:d81:e040:aeed:5cff:fe71:93ff, netmask: ffff:ffff:ffff:ffff::, broadcast: None }) }
    sending packet to 224.0.0.251:5353
    received packet from [fe80::acc3:1c85:eb00:de1e%3]:5353
    received question: IN _spotify-connect._tcp.local
    found interface Interface { name: "wlp3s0", addr: V4(Ifv4Addr { ip: 192.168.0.8, netmask: 255.255.255.0, broadcast: Some(192.168.0.255) }) }
    found interface Interface { name: "br-b2178398e9dc", addr: V4(Ifv4Addr { ip: 172.21.0.1, netmask: 255.255.0.0, broadcast: Some(172.21.255.255) }) }
    found interface Interface { name: "docker0", addr: V4(Ifv4Addr { ip: 172.17.0.1, netmask: 255.255.0.0, broadcast: Some(172.17.255.255) }) }
    found interface Interface { name: "tun0", addr: V4(Ifv4Addr { ip: 10.8.0.14, netmask: 255.255.255.255, broadcast: None }) }
    found interface Interface { name: "wlp3s0", addr: V6(Ifv6Addr { ip: 2a02:908:d81:e040:aeed:5cff:fe71:93ff, netmask: ffff:ffff:ffff:ffff::, broadcast: None }) }
    sending packet to [ff02::fb]:5353
    received packet from 192.168.0.8:5353
    received packet from 192.168.0.8:5353 with no query
    received packet from [fe80::aeed:5cff:fe71:93ff%3]:5353
    received packet from [fe80::aeed:5cff:fe71:93ff%3]:5353 with no query
    registering event source with poller: token=Token(6), interests=READABLE | WRITABLE
    deregistering event source from poller
    received packet from 192.168.0.244:5353
    received question: IN _CC1AD845._sub._googlecast._tcp.local
    received question: IN _CC32E753._sub._googlecast._tcp.local
    received question: IN _googlecast._tcp.local
    received packet from 192.168.0.244:5353
    received question: IN _CC1AD845._sub._googlecast._tcp.local
    received question: IN _CC32E753._sub._googlecast._tcp.local
    received question: IN _googlecast._tcp.local
    received packet from 192.168.0.244:5353
    received question: IN _spotify-connect._tcp.local
    found interface Interface { name: "wlp3s0", addr: V4(Ifv4Addr { ip: 192.168.0.8, netmask: 255.255.255.0, broadcast: Some(192.168.0.255) }) }
    found interface Interface { name: "br-b2178398e9dc", addr: V4(Ifv4Addr { ip: 172.21.0.1, netmask: 255.255.0.0, broadcast: Some(172.21.255.255) }) }
    found interface Interface { name: "docker0", addr: V4(Ifv4Addr { ip: 172.17.0.1, netmask: 255.255.0.0, broadcast: Some(172.17.255.255) }) }
    found interface Interface { name: "tun0", addr: V4(Ifv4Addr { ip: 10.8.0.14, netmask: 255.255.255.255, broadcast: None }) }
    found interface Interface { name: "wlp3s0", addr: V6(Ifv6Addr { ip: 2a02:908:d81:e040:aeed:5cff:fe71:93ff, netmask: ffff:ffff:ffff:ffff::, broadcast: None }) }
    sending packet to 224.0.0.251:5353
    received packet from [fe80::acc3:1c85:eb00:de1e%3]:5353
    received question: IN _spotify-connect._tcp.local
    found interface Interface { name: "wlp3s0", addr: V4(Ifv4Addr { ip: 192.168.0.8, netmask: 255.255.255.0, broadcast: Some(192.168.0.255) }) }
    found interface Interface { name: "br-b2178398e9dc", addr: V4(Ifv4Addr { ip: 172.21.0.1, netmask: 255.255.0.0, broadcast: Some(172.21.255.255) }) }
    found interface Interface { name: "docker0", addr: V4(Ifv4Addr { ip: 172.17.0.1, netmask: 255.255.0.0, broadcast: Some(172.17.255.255) }) }
    found interface Interface { name: "tun0", addr: V4(Ifv4Addr { ip: 10.8.0.14, netmask: 255.255.255.255, broadcast: None }) }
    found interface Interface { name: "wlp3s0", addr: V6(Ifv6Addr { ip: 2a02:908:d81:e040:aeed:5cff:fe71:93ff, netmask: ffff:ffff:ffff:ffff::, broadcast: None }) }
    sending packet to [ff02::fb]:5353
    received packet from 192.168.0.8:5353
    received packet from 192.168.0.8:5353 with no query
    received packet from [fe80::aeed:5cff:fe71:93ff%3]:5353
    received packet from [fe80::aeed:5cff:fe71:93ff%3]:5353 with no query
    received packet from 192.168.0.244:5353
    received question: IN _spotify-connect._tcp.local
    found interface Interface { name: "wlp3s0", addr: V4(Ifv4Addr { ip: 192.168.0.8, netmask: 255.255.255.0, broadcast: Some(192.168.0.255) }) }
    found interface Interface { name: "br-b2178398e9dc", addr: V4(Ifv4Addr { ip: 172.21.0.1, netmask: 255.255.0.0, broadcast: Some(172.21.255.255) }) }
    found interface Interface { name: "docker0", addr: V4(Ifv4Addr { ip: 172.17.0.1, netmask: 255.255.0.0, broadcast: Some(172.17.255.255) }) }
    found interface Interface { name: "tun0", addr: V4(Ifv4Addr { ip: 10.8.0.14, netmask: 255.255.255.255, broadcast: None }) }
    found interface Interface { name: "wlp3s0", addr: V6(Ifv6Addr { ip: 2a02:908:d81:e040:aeed:5cff:fe71:93ff, netmask: ffff:ffff:ffff:ffff::, broadcast: None }) }
    sending packet to 224.0.0.251:5353
    received packet from [fe80::acc3:1c85:eb00:de1e%3]:5353
    received question: IN _spotify-connect._tcp.local
    found interface Interface { name: "wlp3s0", addr: V4(Ifv4Addr { ip: 192.168.0.8, netmask: 255.255.255.0, broadcast: Some(192.168.0.255) }) }
    found interface Interface { name: "br-b2178398e9dc", addr: V4(Ifv4Addr { ip: 172.21.0.1, netmask: 255.255.0.0, broadcast: Some(172.21.255.255) }) }
    found interface Interface { name: "docker0", addr: V4(Ifv4Addr { ip: 172.17.0.1, netmask: 255.255.0.0, broadcast: Some(172.17.255.255) }) }
    found interface Interface { name: "tun0", addr: V4(Ifv4Addr { ip: 10.8.0.14, netmask: 255.255.255.255, broadcast: None }) }
    found interface Interface { name: "wlp3s0", addr: V6(Ifv6Addr { ip: 2a02:908:d81:e040:aeed:5cff:fe71:93ff, netmask: ffff:ffff:ffff:ffff::, broadcast: None }) }
    sending packet to [ff02::fb]:5353
    received packet from 192.168.0.8:5353
    received packet from 192.168.0.8:5353 with no query
    received packet from [fe80::aeed:5cff:fe71:93ff%3]:5353
    received packet from [fe80::aeed:5cff:fe71:93ff%3]:5353 with no query
    registering event source with poller: token=Token(16777222), interests=READABLE | WRITABLE
    deregistering event source from poller
    received packet from 192.168.0.244:5353
    received question: IN _CC1AD845._sub._googlecast._tcp.local
    received question: IN _CC32E753._sub._googlecast._tcp.local
    received question: IN _googlecast._tcp.local
    received packet from 192.168.0.244:5353
    received packet from 192.168.0.244:5353 with no query
    received packet from [fe80::acc3:1c85:eb00:de1e%3]:5353
    received packet from [fe80::acc3:1c85:eb00:de1e%3]:5353 with no query
    registering event source with poller: token=Token(33554438), interests=READABLE | WRITABLE
    received packet from 192.168.0.244:5353
    received question: IN _CC1AD845._sub._googlecast._tcp.local
    received question: IN _CC32E753._sub._googlecast._tcp.local
    received question: IN _googlecast._tcp.local
    deregistering event source from poller
    received packet from 192.168.0.244:5353
    received question: IN _spotify-connect._tcp.local
    found interface Interface { name: "wlp3s0", addr: V4(Ifv4Addr { ip: 192.168.0.8, netmask: 255.255.255.0, broadcast: Some(192.168.0.255) }) }
    found interface Interface { name: "br-b2178398e9dc", addr: V4(Ifv4Addr { ip: 172.21.0.1, netmask: 255.255.0.0, broadcast: Some(172.21.255.255) }) }
    found interface Interface { name: "docker0", addr: V4(Ifv4Addr { ip: 172.17.0.1, netmask: 255.255.0.0, broadcast: Some(172.17.255.255) }) }
    found interface Interface { name: "tun0", addr: V4(Ifv4Addr { ip: 10.8.0.14, netmask: 255.255.255.255, broadcast: None }) }
    found interface Interface { name: "wlp3s0", addr: V6(Ifv6Addr { ip: 2a02:908:d81:e040:aeed:5cff:fe71:93ff, netmask: ffff:ffff:ffff:ffff::, broadcast: None }) }
    sending packet to 224.0.0.251:5353
    received packet from [fe80::acc3:1c85:eb00:de1e%3]:5353
    received question: IN _spotify-connect._tcp.local
    found interface Interface { name: "wlp3s0", addr: V4(Ifv4Addr { ip: 192.168.0.8, netmask: 255.255.255.0, broadcast: Some(192.168.0.255) }) }
    found interface Interface { name: "br-b2178398e9dc", addr: V4(Ifv4Addr { ip: 172.21.0.1, netmask: 255.255.0.0, broadcast: Some(172.21.255.255) }) }
    found interface Interface { name: "docker0", addr: V4(Ifv4Addr { ip: 172.17.0.1, netmask: 255.255.0.0, broadcast: Some(172.17.255.255) }) }
    found interface Interface { name: "tun0", addr: V4(Ifv4Addr { ip: 10.8.0.14, netmask: 255.255.255.255, broadcast: None }) }
    found interface Interface { name: "wlp3s0", addr: V6(Ifv6Addr { ip: 2a02:908:d81:e040:aeed:5cff:fe71:93ff, netmask: ffff:ffff:ffff:ffff::, broadcast: None }) }
    sending packet to [ff02::fb]:5353
    received packet from 192.168.0.8:5353
    received packet from 192.168.0.8:5353 with no query
    received packet from [fe80::aeed:5cff:fe71:93ff%3]:5353
    received packet from [fe80::aeed:5cff:fe71:93ff%3]:5353 with no query
    registering event source with poller: token=Token(50331654), interests=READABLE | WRITABLE
    deregistering event source from poller
    registering event source with poller: token=Token(67108870), interests=READABLE | WRITABLE
    POST "/" {}
    Connecting to AP "ap.spotify.com:443"
    registering event source with poller: token=Token(7), interests=READABLE | WRITABLE
    flushing framed transport
    writing; remaining=378
    framed transport flushed
    deregistering event source from poller
    registering event source with poller: token=Token(83886086), interests=READABLE | WRITABLE
    deregistering event source from poller
    flushing framed transport
    framed transport flushed
    attempting to decode a frame
    frame decoded from buffer
    attempting to decode a frame
    drop Spirc[0]
    Shutting down player thread ...
    drop PlayerInternal[0]
    PlayerInternal thread finished.
    deregistering event source from poller
    drop Session[0]
    drop MercuryManager
    deregistering event source from poller
    drop Dispatch
    deregistering event source from poller
    deregistering event source from poller
    deregistering event source from poller
    deregistering event source from poller
    

    Compilation flags

    • [ ] dbus_mpris
    • [ ] dbus_keyring
    • [x] alsa_backend
    • [ ] portaudio_backend
    • [ ] pulseaudio_backend
    • [ ] rodio_backend

    Versions:

    • OS: 5.15.13-arch1-1
    • Spotifyd: 0.3.3
    • cargo: cargo 1.53.0 (4369396ce 2021-04-27)
    Labels: bug
  • Occasional crashes in 0.3.3

    Jan 17, 2022

    Description: Occasionally spotifyd crashes, more frequently in the last couple of days. Possibly Spotify changing something on the backend?

    To Reproduce: Leave the music on for a few hours.

    Expected behavior: spotifyd does not crash :-)

    Logs

    Jan 17 09:37:19 nuc spotifyd[55192]: Loading <RUSSIAN DRIFT> with Spotify URI <spotify:track:3ONCxAVj5oNNKMH6vHy73W>
    Jan 17 09:37:19 nuc spotifyd[55192]: <RUSSIAN DRIFT> (136660 ms) loaded
    Jan 17 09:39:35 nuc spotifyd[55192]: Loading <Sharingan> with Spotify URI <spotify:track:7u4zUJKATHBeymXMMOXnjJ>
    Jan 17 09:39:35 nuc spotifyd[55192]: <Sharingan> (225985 ms) loaded
    Jan 17 09:43:21 nuc spotifyd[55192]: Loading <On Jet.> with Spotify URI <spotify:track:5T3DBjo3kubrp1KDNrRsYi>
    Jan 17 09:43:21 nuc spotifyd[55192]: <On Jet.> (169500 ms) loaded
    Jan 17 09:46:11 nuc spotifyd[55192]: Loading <Melancholy> with Spotify URI <spotify:track:67J9oAUsdZEsXpELyhPiSD>
    Jan 17 09:46:11 nuc spotifyd[55192]: <Melancholy> (205440 ms) loaded
    Jan 17 09:49:36 nuc spotifyd[55192]: Loading <ERA SVOA> with Spotify URI <spotify:track:0D1orB0F70kDzkUeJgXSe3>
    Jan 17 09:49:37 nuc spotifyd[55192]: <ERA SVOA> (171356 ms) loaded
    Jan 17 09:52:27 nuc spotifyd[55192]: PANIC: Shutting down spotifyd. Error message: index out of bounds: the len is 19 but the index is 18446744073709551612
    

    Compilation flags

    • [ ] dbus_mpris
    • [ ] dbus_keyring
    • [x] alsa_backend
    • [ ] portaudio_backend
    • [ ] pulseaudio_backend
    • [ ] rodio_backend

    Versions:

    • OS: Ubuntu 20.04.3
    • Spotifyd: 0.3.3
    • cargo: N/A
    Labels: bug
  • fix documentation for autoplay option

    Jan 19, 2022

    Instead of the previously advertised "autoplay on connect", autoplay makes playback continue after the user's selection has ended. I also added the option to the config file and documented it there. Apparently that was all missed when the option was initially added (2b91a0c11d01807a3220d1177008c80a8dcc2662).

    For reference, see e.g. librespot's documentation for that flag.

    @Spotifyd/maintainers, it would be great if someone could review this PR!

    related: #1003

  • Print failed authentication to console

    Jan 21, 2022

    Hello, first of all thanks for the great software, it is very very useful.

    Is your feature request related to a problem? Please describe. Is there a convenient way (other than a timeout) to determine whether an authentication attempt has failed?

    I tried --debug-credentials, and --verbose to distinguish between a successful and unsuccessful attempt. But there isn't any useful information other than a timeout when waiting for 'Authenticated as "username" !'.

    Describe the solution you'd like: It would be very cool if there were a message for failed authentication (when using [cached] credentials).

    Describe alternatives you've considered: Alternatively, password_cmd could be called AFTER, and only if, the cached credentials have failed.

    Additional context: I use zenity to prompt via 'password_cmd', which works as expected. When I log in a second time, I have to find out somehow whether the cached credentials are still valid (any idea how to do this?), because I don't want to be prompted for the password every time I connect.

    Why do I have to log in? I only see my device in 'spotipy' (sic!) once I have successfully authenticated.

    Labels: enhancement
  • properly handle Session::connect() errors

    Jan 21, 2022

    Previously, a SessionError from the connection future was silently ignored and the application just continued to run (waiting for a potential connection from discovery).

    Now, if the connection can't be established, the error is logged and the application shuts down. For example, if you provide a wrong password:

    ➜ target/release/spotifyd --no-daemon                 
    Loading config from "/home/user/.config/spotifyd/spotifyd.conf"
    No password specified. Checking password_cmd
    Running "zenity --password" using "/usr/bin/zsh"
    No proxy specified
    Using software volume controller.
    Connecting to AP "ap.spotify.com:443"
    Failed to connect to spotify: Login failed with reason: Bad credentials
    

    Before:

    ➜ target/release/spotifyd --no-daemon                   
    Loading config from "/home/user/.config/spotifyd/spotifyd.conf"
    No password specified. Checking password_cmd
    Running "zenity --password" using "/usr/bin/zsh"
    No proxy specified
    Using software volume controller.
    Connecting to AP "ap.spotify.com:443"
    # application continues to run, there would not even be a hint about
    # the failed authentication in the debug logs
    

    fixes (maybe): #1040

  • Update snapcraft.yaml

    Jan 21, 2022

    snapcraft.yaml is updated to make a snapcraft build possible on Ubuntu 20.04 LTS. In scope of this PR:

    • the source link to GitHub is added. Without the direct link, the source code is not populated to the build VM and it's not possible to determine the version. If you know a better way, please tell me and I'll update the PR,
    • the required dependency libasound2 is added,
    • the base core is specified explicitly, as it is needed for the current snapcraft,
    • the service (daemon) is removed from the snap. I think that running this daemon under the superuser account (even in a confined snap) is not the best option from a security perspective. Also, different PC users may require different Spotify Premium credentials (like family members with different music tastes), and the best place to store the credentials is the user's profile or keychain, but spotifyd doesn't support keychains. So the current approach of running spotifyd as a system-wide service is insecure, hard to manage and not suitable for everyone. It's therefore better to leave such configuration to the user's responsibility than to introduce an insecure approach.
  • Spotifyd crashes without specifying a volume_controller

    Sep 17, 2019

    After building on macOS 10.4.6 with portaudio as the backend, launching spotifyd results in the following panic:

    thread 'main' panicked at 'called `Option::unwrap()` on a `None` value', src/libcore/option.rs:347:21
    

    Running with RUST_BACKTRACE=1 shows the following stack backtrace:

       0: std::panicking::default_hook::{{closure}}
       1: std::panicking::default_hook
       2: std::panicking::rust_panic_with_hook
       3: std::panicking::continue_panic_fmt
       4: rust_begin_unwind
       5: core::panicking::panic_fmt
       6: core::panicking::panic
       7: spotifyd::config::get_internal_config
       8: spotifyd::main
       9: std::rt::lang_start::{{closure}}
      10: std::panicking::try::do_call
      11: __rust_maybe_catch_panic
      12: std::rt::lang_start_internal
      13: main
    

    I was able to build a working version from source in August, and since then I've only changed my Rust installation from the Homebrew version to rustup (I guess that's not the problem here?). Any suggestions?

    Labels: bug, reproducibility: easy
  • [ERROR] Caught panic with message: Authentication failed with reason: BadCredentials

    Apr 1, 2018

    Spotifyd worked well for 3 days until one day (2018-04-01) it stopped with the error "BadCredentials".

    Credentials were not changed in either spotifyd or spotify.com.

    I don't know how to look up the version, but my source was the Arch Linux AUR package spotifyd-git 0.1.1.2.g7451cd6-1.

    log:

    00:05:54 [TRACE] mio::poll: [<unknown>:785] registering with poller
    00:05:54 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:523] build; num-workers=8
    00:05:54 [DEBUG] tokio_reactor::background: starting background reactor
    00:05:54 [INFO] Using software volume controller.
    00:05:54 [DEBUG] librespot_connect::discovery: Zeroconf server listening on 0.0.0.0:0
    00:05:54 [TRACE] mio::poll: [<unknown>:785] registering with poller
    00:05:54 [TRACE] mio::poll: [<unknown>:785] registering with poller
    00:05:54 [TRACE] mio::poll: [<unknown>:785] registering with poller
    00:05:54 [TRACE] tokio_reactor: [<unknown>:330] event Writable Token(2)
    00:05:54 [TRACE] tokio_reactor: [<unknown>:330] event Writable Token(3)
    00:05:54 [DEBUG] tokio_reactor: loop process - 2 events, 0.000s
    00:05:54 [DEBUG] tokio_core::reactor: added a timeout: 0
    00:05:54 [TRACE] mio::poll: [<unknown>:785] registering with poller
    00:05:54 [TRACE] tokio_reactor: [<unknown>:330] event Writable Token(4)
    00:05:54 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
    00:05:54 [TRACE] hyper::client::pool: [<unknown>:178] park; waiting for idle connection: "http://apresolve.spotify.com"
    00:05:54 [TRACE] hyper::client::connect: [<unknown>:118] Http::connect("http://apresolve.spotify.com/")
    00:05:54 [DEBUG] hyper::client::dns: resolving host="apresolve.spotify.com", port=80
    00:05:54 [DEBUG] tokio_core::reactor: consuming notification queue
    00:05:54 [TRACE] mdns::fsm: [/home/user/.cargo/git/checkouts/rust-mdns-881ed19b93df0e9d/733b2b6/src/fsm.rs:275] sending packet to V4(224.0.0.251:5353)
    00:05:54 [TRACE] tokio_reactor: [<unknown>:330] event Readable | Writable Token(2)
    00:05:54 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
    00:05:54 [TRACE] mdns::fsm: [/home/user/.cargo/git/checkouts/rust-mdns-881ed19b93df0e9d/733b2b6/src/fsm.rs:275] sending packet to V6([ff02::fb]:5353)
    00:05:54 [TRACE] tokio_reactor: [<unknown>:330] event Readable | Writable Token(2)
    00:05:54 [TRACE] tokio_reactor: [<unknown>:330] event Readable | Writable Token(3)
    00:05:54 [DEBUG] tokio_reactor: loop process - 2 events, 0.000s
    00:05:54 [DEBUG] tokio_core::reactor: updating a timeout: 0
    00:05:54 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 298198 }
    00:05:54 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 633948051 }
    00:05:54 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 7023 }
    00:05:54 [TRACE] mdns::fsm: [/home/user/.cargo/git/checkouts/rust-mdns-881ed19b93df0e9d/733b2b6/src/fsm.rs:76] received packet from V4(192.168.1.100:5353)
    00:05:54 [TRACE] mdns::fsm: [/home/user/.cargo/git/checkouts/rust-mdns-881ed19b93df0e9d/733b2b6/src/fsm.rs:87] received packet from V4(192.168.1.100:5353) with no query
    00:05:54 [TRACE] mdns::fsm: [/home/user/.cargo/git/checkouts/rust-mdns-881ed19b93df0e9d/733b2b6/src/fsm.rs:76] received packet from V6([fe80::59cc:f6e9:af23:975e]:5353)
    00:05:54 [TRACE] mdns::fsm: [/home/user/.cargo/git/checkouts/rust-mdns-881ed19b93df0e9d/733b2b6/src/fsm.rs:87] received packet from V6([fe80::59cc:f6e9:af23:975e]:5353) with no query
    00:05:54 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 29906 }
    00:05:54 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 633988276 }
    00:05:54 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 5821 }
    00:05:54 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 340 }
    00:05:54 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 633997373 }
    00:05:54 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 15680 }
    00:05:54 [TRACE] tokio_reactor: [<unknown>:330] event Writable Token(3)
    00:05:54 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
    00:05:54 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 87637707 }
    00:05:54 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 721655288 }
    00:05:54 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 42811 }
    00:05:54 [DEBUG] hyper::client::connect: connecting to 104.199.64.136:80
    00:05:54 [TRACE] mio::poll: [<unknown>:785] registering with poller
    00:05:54 [TRACE] tokio_reactor: [<unknown>:330] event Writable Token(5)
    00:05:54 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
    00:05:54 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 40125343 }
    00:05:54 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 761954086 }
    00:05:54 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 13345 }
    00:05:54 [TRACE] hyper::proto::h1::dispatch: [<unknown>:274] Dispatcher::poll
    00:05:54 [TRACE] hyper::proto::h1::conn: [<unknown>:284] read_keep_alive; is_mid_message=false
    00:05:54 [TRACE] hyper::proto: [<unknown>:122] should_keep_alive(version=Http11, header=None) = true
    00:05:54 [TRACE] hyper::proto::h1::role: [<unknown>:327] ClientTransaction::encode has_body=false, method=None
    00:05:54 [TRACE] hyper::proto::h1::io: [<unknown>:542] reclaiming write buf Vec
    00:05:54 [DEBUG] hyper::proto::h1::io: flushed 47 bytes
    00:05:54 [TRACE] hyper::proto::h1::conn: [<unknown>:580] flushed State { reading: Init, writing: KeepAlive, keep_alive: Busy, error: None, read_task: None }
    00:05:54 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 73367 }
    00:05:54 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 762090221 }
    00:05:54 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 7153 }
    00:05:55 [TRACE] tokio_reactor: [<unknown>:330] event Readable | Writable Token(5)
    00:05:55 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
    00:05:55 [TRACE] hyper::proto::h1::dispatch: [<unknown>:274] Dispatcher::poll
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:172] Conn::read_head
    00:05:55 [DEBUG] hyper::proto::h1::io: read 686 bytes
    00:05:55 [TRACE] hyper::proto::h1::role: [<unknown>:231] Response.parse([Header; 100], [u8; 686])
    00:05:55 [TRACE] hyper::proto::h1::role: [<unknown>:236] Response.parse Complete(268)
    00:05:55 [TRACE] hyper::header: [<unknown>:355] maybe_literal not found, copying "Keep-Alive"
    00:05:55 [TRACE] hyper::header: [<unknown>:355] maybe_literal not found, copying "Vary"
    00:05:55 [DEBUG] hyper::proto::h1::io: parsed 9 headers (268 bytes)
    00:05:55 [DEBUG] hyper::proto::h1::conn: incoming body is content-length (418 bytes)
    00:05:55 [TRACE] hyper::proto: [<unknown>:133] expecting_continue(version=Http11, header=None) = false
    00:05:55 [TRACE] hyper::proto: [<unknown>:122] should_keep_alive(version=Http11, header=Some(Connection([KeepAlive]))) = true
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:246] Conn::read_body
    00:05:55 [TRACE] hyper::proto::h1::decode: [<unknown>:88] decode; state=Length(418)
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:580] flushed State { reading: Body(Length(0)), writing: KeepAlive, keep_alive: Busy, error: None, read_task: None }
    00:05:55 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 41689020 }
    00:05:55 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 803790242 }
    00:05:55 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 7605 }
    00:05:55 [TRACE] hyper::proto::h1::dispatch: [<unknown>:274] Dispatcher::poll
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:246] Conn::read_body
    00:05:55 [TRACE] hyper::proto::h1::decode: [<unknown>:88] decode; state=Length(0)
    00:05:55 [DEBUG] hyper::proto::h1::conn: incoming body completed
    00:05:55 [TRACE] hyper::client::pool: [<unknown>:332] pool dropped, dropping pooled ("http://apresolve.spotify.com")
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:829] State::close()
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:426] maybe_notify; no task to notify
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:284] read_keep_alive; is_mid_message=true
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:311] parking current task
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:423] maybe_notify; notifying task
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:580] flushed State { reading: Closed, writing: Closed, keep_alive: Disabled, error: None, read_task: Some(Task) }
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:588] shut down IO
    00:05:55 [TRACE] hyper::proto::h1::dispatch: [<unknown>:74] Dispatch::poll done
    00:05:55 [TRACE] mio::poll: [<unknown>:905] deregistering handle with poller
    00:05:55 [DEBUG] tokio_reactor: dropping I/O source: 4
    00:05:55 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 99727 }
    00:05:55 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 803922570 }
    00:05:55 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 7494 }
    00:05:55 [INFO] Connecting to AP "gew1-accesspoint-b-ksm3.ap.spotify.com:4070"
    00:05:55 [TRACE] mio::poll: [<unknown>:785] registering with poller
    00:05:55 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 1442 }
    00:05:55 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 856588695 }
    00:05:55 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 10310 }
    00:05:55 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 451 }
    00:05:55 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 856603463 }
    00:05:55 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 6833 }
    00:05:55 [TRACE] tokio_reactor: [<unknown>:330] event Writable Token(5)
    00:05:55 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
    00:05:55 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 40060141 }
    00:05:55 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 896673953 }
    00:05:55 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 11552 }
    00:05:55 [TRACE] tokio_reactor: [<unknown>:330] event Readable | Writable Token(5)
    00:05:55 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
    00:05:55 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 38608403 }
    00:05:55 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 941009139 }
    00:05:55 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 12343 }
    00:05:55 [TRACE] tokio_io::framed_write: [<unknown>:182] flushing framed transport
    00:05:55 [TRACE] tokio_io::framed_write: [<unknown>:185] writing; remaining=128
    00:05:55 [TRACE] tokio_io::framed_write: [<unknown>:202] framed transport flushed
    00:05:56 [TRACE] tokio_reactor: [<unknown>:330] event Readable | Writable Token(5)
    00:05:56 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
    00:05:56 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 969123604 }
    00:05:56 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 275, tv_nsec: 915854447 }
    00:05:56 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 23234 }
    00:05:56 [TRACE] tokio_io::framed_read: [<unknown>:189] attempting to decode a frame
    00:05:56 [TRACE] tokio_io::framed_read: [<unknown>:192] frame decoded from buffer
    00:05:56 [TRACE] mio::poll: [<unknown>:905] deregistering handle with poller
    00:05:56 [DEBUG] tokio_reactor: dropping I/O source: 4
    00:05:56 [TRACE] tokio_reactor: [<unknown>:330] event Readable Token(0)
    00:05:56 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
    00:05:56 [DEBUG] tokio_reactor::background: shutting background reactor down NOW
    00:05:56 [DEBUG] tokio_reactor::background: background reactor has shutdown
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:871] shutdown; state=State { lifecycle: 0, num_futures: 0 }
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:917]   -> transitioned to shutdown
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:929]   -> shutting down workers
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933]   -> shutdown worker; idx=7; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=7
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=7
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933]   -> shutdown worker; idx=6; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=6
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=6
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933]   -> shutdown worker; idx=5; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=5
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=5
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933]   -> shutdown worker; idx=4; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=4
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=4
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933]   -> shutdown worker; idx=3; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=3
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=3
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933]   -> shutdown worker; idx=2; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=2
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=2
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933]   -> shutdown worker; idx=1; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=1
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=1
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933]   -> shutdown worker; idx=0; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=0
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=0
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:986] notifying shutdown task
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:851] Shutdown::poll
    00:05:56 [DEBUG] tokio_core::reactor::timeout_token: cancel timeout 0
    
    Labels: bug, wontfix
  • Spotifyd login problem

    Oct 12, 2020

    Sorry, I normally fill out a bug report properly, but it's late at night for me and I'm really tired, plus I don't think this problem warrants a full report. If it's not easily fixable, then I'll document every detail.

    spotifyd 0.2.24 from the Arch community repo; spotify-tui from the AUR.

    I've tried to get spotifyd to actually output information using the --verbose flag but that does nothing for me (literally outputs nothing at all).

    Now, I'm not 100% sure this is the issue, but I believe the config file isn't using the username and password I've given it. I've tried wrapping the username and password in quotation marks and also tried substituting my email for my username.

    I feel like whatever syntax it's looking for, I'm probably not using it. Please clear this up in the readme, or at least in the example config file.

    Do you use the email for the username field or the username shown on the profile, and do you need to wrap the username and password in quotation marks? (See the config sketch below.)

    Anyway, back to my problem: I don't believe spotifyd is running properly, because spotify-tui doesn't show my device in the list of devices, even though I specifically set it in the spotifyd config.
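    For reference, a minimal spotifyd.conf sketch (the values below are placeholders, and the keys assume the TOML format used by the example config shipped with spotifyd; string values must be quoted):

        [global]
        # Spotify account name; the account e-mail has also been reported to work.
        username = "your-spotify-username"
        password = "your-password"

        # Name under which this machine shows up for Spotify Connect clients.
        device_name = "my-spotifyd"

        # Audio backend compiled into this build, e.g. "alsa" or "pulseaudio".
        backend = "pulseaudio"

    Note also that once spotifyd daemonises, its output is detached from the terminal, so --verbose on its own appears to print nothing; running spotifyd --no-daemon --verbose in the foreground keeps the log on the terminal and makes it easier to see whether the credentials are accepted.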

    Labels: bug, wontfix
  • Panic: zero-initialisation of ov_callbacks

    Nov 20, 2020

    Description

    I had successfully compiled and used Spotifyd for several days prior to this issue. I then decided I wanted to enable additional feature flags, so I uninstalled the original instance and built again.

    However, after installing this new instance, I receive the following log messages when trying to play music from a Connect-capable client:

    
    No proxy specified
    Using software volume controller.
    Connecting to AP "gew1-accesspoint-b-qzv4.ap.spotify.com:443"
    Authenticated as "USERNAME" !
    Country: "GB"
    Unhandled DBus message: (Signal, Some("/org/freedesktop/DBus"), Some("org.freedesktop.DBus"), Some("NameAcquired"))
    Unhandled DBus message: (Signal, Some("/org/freedesktop/DBus"), Some("org.freedesktop.DBus"), Some("NameAcquired"))
    Loading <Broken> with Spotify URI <spotify:track:6qF9QltwbDeujrVrzpCoLj>
    Caught panic with message: attempted to zero-initialize type `librespot_tremor::tremor_sys::ov_callbacks`, which is invalid
    Caught panic with message: called `Result::unwrap()` on an `Err` value: "SendError(..)"
    Player thread panicked!

    The application then exits.

    Upon realising this, I attempted to rebuild the source with the original options and reinstall that; however, the same error appears as above. I tried a variety of build options and appropriate configuration tweaks, including using the crates.io release, to no avail.

    To Reproduce: Not something I can give steps for, as it occurred suddenly and persists across rebuilds.

    Logs

    Loading config from "/home/pi/.config/spotifyd/spotifyd.conf"

    CliConfig { config_path: None, no_daemon: true, verbose: true, pid: None, shared_config: SharedConfigValues { username: Some("taken out for privacy"), password: Some("taken out for privacy"), password_cmd: None, use_keyring: false, on_song_change_hook: None, cache_path: None, no-audio-cache: false, backend: Some(PulseAudio), volume_controller: None, device: None, control: None, mixer: None, device_name: Some("RaspberryPiTV"), bitrate: Some(Bitrate320), volume_normalisation: false, normalisation_pregain: None, zeroconf_port: None, proxy: None } }

    Found shell "/bin/bash" using SHELL environment variable.

    No proxy specified

    registering with poller

    registering with poller

    registering with poller

    registering with poller

    build; num-workers=4

    registering with poller

    Using software volume controller.

    starting background reactor

    adding I/O source: 0

    registering with poller

    Zeroconf server listening on 0.0.0.0:37183

    adding I/O source: 4194305

    registering with poller

    event Writable Token(4194305)

    adding I/O source: 8388610

    registering with poller

    loop process - 1 events, 0.000s

    event Writable Token(8388610)

    loop process - 1 events, 0.000s

    adding I/O source: 12582915

    registering with poller

    event Writable Token(12582915)

    loop process - 1 events, 0.000s

    park; waiting for idle connection: "http://apresolve.spotify.com"

    Http::connect("http://apresolve.spotify.com/")

    consuming notification queue

    resolving host="apresolve.spotify.com", port=80

    scheduling Read for: 0

    scheduling Read for: 1

    sending packet to 224.0.0.251:5353

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    scheduling Read for: 2

    sending packet to [ff02::fb]:5353

    scheduling Read for: 3

    loop poll - 1.90442ms

    loop time - Instant { tv_sec: 5973, tv_nsec: 505571470 }

    loop process, 41.351µs

    event Readable | Writable Token(8388610)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.182:5353

    received packet from 192.168.1.182:5353 with no query

    scheduling Read for: 1

    scheduling Read for: 1

    received packet from [fe80::6c2e:d9c1:337:5394]:5353

    received packet from [fe80::6c2e:d9c1:337:5394]:5353 with no query

    scheduling Read for: 2

    scheduling Read for: 2

    loop poll - 326.755µs

    loop time - Instant { tv_sec: 5973, tv_nsec: 505963465 }

    loop process, 36.055µs

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 45.666µs

    loop time - Instant { tv_sec: 5973, tv_nsec: 506065445 }

    loop process, 34µs

    loop poll - 128.534738ms

    loop time - Instant { tv_sec: 5973, tv_nsec: 634651312 }

    loop process, 56.574µs

    connecting to 34.98.74.57:80

    adding I/O source: 16777220

    registering with poller

    scheduling Write for: 4

    event Writable Token(16777220)

    loop process - 1 events, 0.000s

    loop poll - 18.134288ms

    loop time - Instant { tv_sec: 5973, tv_nsec: 653032245 }

    loop process, 22.926µs

    read_keep_alive; is_mid_message=false

    scheduling Read for: 4

    should_keep_alive(version=Http11, header=None) = true

    Client::encode has_body=false, method=None

    reclaiming write buf Vec

    flushed 47 bytes

    flushed State { reading: Init, writing: KeepAlive, keep_alive: Busy, error: None }

    wants_read_again? false

    loop poll - 199.664µs

    loop time - Instant { tv_sec: 5973, tv_nsec: 653334649 }

    loop process, 16.074µs

    event Readable | Writable Token(16777220)

    loop process - 1 events, 0.000s

    Conn::read_head

    read 594 bytes

    Response.parse([Header; 100], [u8; 594])

    Response.parse Complete(176)

    maybe_literal not found, copying "Via"

    parsed 5 headers (176 bytes)

    incoming body is content-length (418 bytes)

    expecting_continue(version=Http11, header=None) = false

    should_keep_alive(version=Http11, header=None) = true

    Conn::read_body

    decode; state=Length(418)

    flushed State { reading: Body(Length(0)), writing: KeepAlive, keep_alive: Busy, error: None }

    wants_read_again? false

    loop poll - 27.867997ms

    loop time - Instant { tv_sec: 5973, tv_nsec: 681227461 }

    loop process, 18.445µs

    Conn::read_body

    decode; state=Length(0)

    incoming body completed

    scheduling Read for: 4

    maybe_notify; read_from_io blocked

    read_keep_alive; is_mid_message=false

    scheduling Read for: 4

    signal: Want

    flushed State { reading: Init, writing: Init, keep_alive: Idle, error: None }

    wants_read_again? false

    poll_want: taker wants!

    pool dropped, dropping pooled ("http://apresolve.spotify.com")

    loop poll - 169.128µs

    loop time - Instant { tv_sec: 5973, tv_nsec: 681475440 }

    loop process, 17.796µs

    Connecting to AP "gew1-accesspoint-b-n2lk.ap.spotify.com:443"

    adding I/O source: 20971525

    registering with poller

    scheduling Write for: 5

    read_keep_alive; is_mid_message=false

    scheduling Read for: 4

    client tx closed

    State::close_read()

    State::close_write()

    flushed State { reading: Closed, writing: Closed, keep_alive: Disabled, error: None }

    wants_read_again? false

    shut down IO

    deregistering handle with poller

    dropping I/O source: 4

    signal: Closed

    loop poll - 152.628µs

    loop time - Instant { tv_sec: 5973, tv_nsec: 814064256 }

    loop process, 18.722µs

    event Writable Token(20971525)

    loop process - 1 events, 0.000s

    loop poll - 24.842684ms

    loop time - Instant { tv_sec: 5973, tv_nsec: 838934625 }

    loop process, 22.519µs

    scheduling Read for: 5

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    loop poll - 22.965875ms

    loop time - Instant { tv_sec: 5973, tv_nsec: 884802432 }

    loop process, 22.981µs

    flushing framed transport

    writing; remaining=133

    framed transport flushed

    scheduling Read for: 5

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.237:5353

    scheduling Read for: 1

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 179.811013ms

    loop time - Instant { tv_sec: 5974, tv_nsec: 87816409 }

    loop process, 22.759µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    loop poll - 619.45243ms

    loop time - Instant { tv_sec: 5974, tv_nsec: 707304894 }

    loop process, 119.943µs

    attempting to decode a frame

    frame decoded from buffer

    Authenticated as "REDACTED" !

    new Session[0]

    new Spirc[0]

    new MercuryManager

    input volume:65535 to mixer: 65535

    attempting to decode a frame

    frame decoded from buffer

    Session[0] strong=4 weak=2

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    frame decoded from buffer

    Country: "GB"

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    writing; remaining=876

    framed transport flushed

    loop poll - 562.548µs

    loop time - Instant { tv_sec: 5974, tv_nsec: 708687302 }

    loop process, 40.351µs

    new Player[0]

    Using PulseAudio sink

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 17.572444ms

    loop time - Instant { tv_sec: 5974, tv_nsec: 726319912 }

    loop process, 39.629µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 16.785861ms

    loop time - Instant { tv_sec: 5974, tv_nsec: 743167309 }

    loop process, 39.944µs

    subscribed uri=hm://remote/user/REDACTED/ count=0

    loop poll - 8.352µs

    loop time - Instant { tv_sec: 5974, tv_nsec: 743305474 }

    loop process, 38.499µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 2.744613ms

    loop time - Instant { tv_sec: 5974, tv_nsec: 746107697 }

    loop process, 37.463µs

    Modify_watch: Watch { fd: 23, read: true, write: false }, poll_now: false

    adding I/O source: 25165828

    registering with poller

    Dropping AConnection

    scheduling Read for: 4

    scheduling Read for: 4

    D-Bus i/o poll ready: 23 is NotReady

    handle_msgs: (Signal, Some("/org/freedesktop/DBus"), Some("org.freedesktop.DBus"), Some("NameAcquired"))

    handle_msgs: (Signal, Some("/org/freedesktop/DBus"), Some("org.freedesktop.DBus"), Some("NameAcquired"))

    loop poll - 216.682µs

    loop time - Instant { tv_sec: 5974, tv_nsec: 751050023 }

    loop process, 45.592µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 49.369448ms

    loop time - Instant { tv_sec: 5974, tv_nsec: 800487322 }

    loop process, 44.703µs

    Polling message stream

    msgstream found Ok(Ready(Some((Signal, Some("/org/freedesktop/DBus"), Some("org.freedesktop.DBus"), Some("NameAcquired")))))

    Unhandled DBus message: (Signal, Some("/org/freedesktop/DBus"), Some("org.freedesktop.DBus"), Some("NameAcquired"))

    Polling message stream

    msgstream found Ok(Ready(Some((Signal, Some("/org/freedesktop/DBus"), Some("org.freedesktop.DBus"), Some("NameAcquired")))))

    Unhandled DBus message: (Signal, Some("/org/freedesktop/DBus"), Some("org.freedesktop.DBus"), Some("NameAcquired"))

    Polling message stream

    msgstream found Ok(NotReady)

    kMessageTypeNotify "Nokia 5.3" a0f1362bbbc96bff82ba9a7ebcfc56ffcacfdeda 1698217680 1605868501989

    loop poll - 12.981µs

    loop time - Instant { tv_sec: 5974, tv_nsec: 800946334 }

    loop process, 48.759µs

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.237:5353

    scheduling Read for: 1

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 295.089105ms

    loop time - Instant { tv_sec: 5975, tv_nsec: 96105753 }

    loop process, 48.814µs

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.237:5353

    scheduling Read for: 1

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 4.213226017s

    loop time - Instant { tv_sec: 5979, tv_nsec: 309408065 }

    loop process, 21.741µs

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.237:5353

    scheduling Read for: 1

    scheduling Read for: 1

    sending packet to 224.0.0.251:5353

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    scheduling Read for: 2

    loop poll - 196.614689ms

    loop time - Instant { tv_sec: 5979, tv_nsec: 506057828 }

    loop process, 22.703µs

    received packet from 192.168.1.182:5353

    received packet from 192.168.1.182:5353 with no query

    scheduling Read for: 1

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 74.851µs

    loop time - Instant { tv_sec: 5979, tv_nsec: 506169549 }

    loop process, 17.518µs

    event Readable Token(0)

    loop process - 1 events, 0.000s

    scheduling Read for: 0

    loop poll - 66.485193ms

    loop time - Instant { tv_sec: 5979, tv_nsec: 572681649 }

    loop process, 20.259µs

    Conn::read_head

    adding I/O source: 29360134

    registering with poller

    scheduling Read for: 6

    event Writable Token(29360134)

    loop process - 1 events, 0.000s

    flushed State { reading: Init, writing: Init, keep_alive: Busy, error: None }

    wants_read_again? false

    loop poll - 107.906µs

    loop time - Instant { tv_sec: 5979, tv_nsec: 572822221 }

    loop process, 16.407µs

    event Readable | Writable Token(29360134)

    loop process - 1 events, 0.000s

    Conn::read_head

    read 222 bytes

    Request.parse([Header; 100], [u8; 222])

    Request.parse Complete(222)

    maybe_literal not found, copying "Keep-Alive"

    parsed 6 headers (222 bytes)

    incoming body is content-length (0 bytes)

    expecting_continue(version=Http11, header=None) = false

    should_keep_alive(version=Http11, header=Some(Connection([KeepAlive]))) = true

    read_keep_alive; is_mid_message=true

    should_keep_alive(version=Http11, header=None) = true

    Server::encode has_body=true, method=Some(Get)

    encoding chunked 450B

    flushed 546 bytes

    scheduling Read for: 6

    maybe_notify; read_from_io blocked

    flushed State { reading: Init, writing: Init, keep_alive: Idle, error: None }

    wants_read_again? false

    loop poll - 1.541295ms

    loop time - Instant { tv_sec: 5979, tv_nsec: 574388849 }

    loop process, 18.092µs

    event Readable | Writable Token(29360134)

    loop process - 1 events, 0.000s

    Conn::read_head

    read 0 bytes

    parse eof

    State::close_read()

    read eof

    read_keep_alive; is_mid_message=true

    flushed State { reading: Closed, writing: Init, keep_alive: Disabled, error: None }

    wants_read_again? false

    shut down IO

    deregistering handle with poller

    dropping I/O source: 6

    loop poll - 3.530881ms

    loop time - Instant { tv_sec: 5979, tv_nsec: 577948026 }

    loop process, 17.259µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 294.498761ms

    loop time - Instant { tv_sec: 5979, tv_nsec: 872473657 }

    loop process, 47.518µs

    Polling message stream

    msgstream found Ok(NotReady)

    kMessageTypeHello "Nokia 5.3" a0f1362bbbc96bff82ba9a7ebcfc56ffcacfdeda 1698222811 1605868507120

    scheduling Read for: 5

    flushing framed transport

    writing; remaining=393

    framed transport flushed

    loop poll - 212.275µs

    loop time - Instant { tv_sec: 5979, tv_nsec: 873014465 }

    loop process, 43.981µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 27.351412ms

    loop time - Instant { tv_sec: 5979, tv_nsec: 900432468 }

    loop process, 46.426µs

    Polling message stream

    msgstream found Ok(NotReady)

    loop poll - 12.148µs

    loop time - Instant { tv_sec: 5979, tv_nsec: 900612003 }

    loop process, 49.203µs

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.237:5353

    scheduling Read for: 1

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 407.842414ms

    loop time - Instant { tv_sec: 5980, tv_nsec: 308527657 }

    loop process, 47.777µs

    event Readable Token(0)

    loop process - 1 events, 0.000s

    scheduling Read for: 0

    loop poll - 220.500978ms

    loop time - Instant { tv_sec: 5980, tv_nsec: 529104893 }

    loop process, 47.907µs

    Conn::read_head

    adding I/O source: 33554438

    registering with poller

    scheduling Read for: 6

    event Writable Token(33554438)

    loop process - 1 events, 0.000s

    flushed State { reading: Init, writing: Init, keep_alive: Busy, error: None }

    wants_read_again? false

    loop poll - 268.849µs

    loop time - Instant { tv_sec: 5980, tv_nsec: 529448093 }

    loop process, 58.462µs

    event Readable | Writable Token(33554438)

    loop process - 1 events, 0.000s

    Conn::read_head

    read 222 bytes

    Request.parse([Header; 100], [u8; 222])

    Request.parse Complete(222)

    maybe_literal not found, copying "Keep-Alive"

    parsed 6 headers (222 bytes)

    incoming body is content-length (0 bytes)

    expecting_continue(version=Http11, header=None) = false

    should_keep_alive(version=Http11, header=Some(Connection([KeepAlive]))) = true

    read_keep_alive; is_mid_message=true

    should_keep_alive(version=Http11, header=None) = true

    Server::encode has_body=true, method=Some(Get)

    encoding chunked 450B

    flushed 546 bytes

    scheduling Read for: 6

    maybe_notify; read_from_io blocked

    flushed State { reading: Init, writing: Init, keep_alive: Idle, error: None }

    wants_read_again? false

    loop poll - 847.934µs

    loop time - Instant { tv_sec: 5980, tv_nsec: 530378970 }

    loop process, 43.407µs

    event Readable | Writable Token(33554438)

    loop process - 1 events, 0.000s

    Conn::read_head

    read 0 bytes

    parse eof

    State::close_read()

    read eof

    read_keep_alive; is_mid_message=true

    flushed State { reading: Closed, writing: Init, keep_alive: Disabled, error: None }

    wants_read_again? false

    shut down IO

    deregistering handle with poller

    dropping I/O source: 6

    loop poll - 7.834475ms

    loop time - Instant { tv_sec: 5980, tv_nsec: 538281777 }

    loop process, 47.296µs

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.237:5353

    scheduling Read for: 1

    scheduling Read for: 1

    sending packet to 224.0.0.251:5353

    scheduling Read for: 2

    loop poll - 77.209149ms

    loop time - Instant { tv_sec: 5980, tv_nsec: 615564184 }

    loop process, 45.148µs

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.182:5353

    received packet from 192.168.1.182:5353 with no query

    scheduling Read for: 1

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 318.811µs

    loop time - Instant { tv_sec: 5980, tv_nsec: 615953865 }

    loop process, 39.129µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 227.069635ms

    loop time - Instant { tv_sec: 5980, tv_nsec: 843083388 }

    loop process, 46.537µs

    Polling message stream

    msgstream found Ok(NotReady)

    kMessageTypeNotify "Nokia 5.3" a0f1362bbbc96bff82ba9a7ebcfc56ffcacfdeda 1698223783 1605868508092

    loop poll - 10.407µs

    loop time - Instant { tv_sec: 5980, tv_nsec: 843346959 }

    loop process, 44.536µs

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.237:5353

    scheduling Read for: 1

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 478.45311ms

    loop time - Instant { tv_sec: 5981, tv_nsec: 321866568 }

    loop process, 41.704µs

    event Readable Token(0)

    loop process - 1 events, 0.000s

    scheduling Read for: 0

    loop poll - 938.291809ms

    loop time - Instant { tv_sec: 5982, tv_nsec: 260224080 }

    loop process, 48.315µs

    Conn::read_head

    adding I/O source: 37748742

    registering with poller

    event Readable | Writable Token(37748742)

    loop process - 1 events, 0.000s

    read 222 bytes

    Request.parse([Header; 100], [u8; 222])

    Request.parse Complete(222)

    maybe_literal not found, copying "Keep-Alive"

    parsed 6 headers (222 bytes)

    incoming body is content-length (0 bytes)

    expecting_continue(version=Http11, header=None) = false

    should_keep_alive(version=Http11, header=Some(Connection([KeepAlive]))) = true

    read_keep_alive; is_mid_message=true

    should_keep_alive(version=Http11, header=None) = true

    Server::encode has_body=true, method=Some(Get)

    encoding chunked 450B

    flushed 546 bytes

    scheduling Read for: 6

    maybe_notify; read_from_io blocked

    flushed State { reading: Init, writing: Init, keep_alive: Idle, error: None }

    wants_read_again? false

    loop poll - 843.86µs

    loop time - Instant { tv_sec: 5982, tv_nsec: 261142976 }

    loop process, 42.648µs

    event Readable | Writable Token(37748742)

    loop process - 1 events, 0.000s

    Conn::read_head

    read 0 bytes

    parse eof

    State::close_read()

    read eof

    read_keep_alive; is_mid_message=true

    flushed State { reading: Closed, writing: Init, keep_alive: Disabled, error: None }

    wants_read_again? false

    shut down IO

    deregistering handle with poller

    dropping I/O source: 6

    loop poll - 2.008289ms

    loop time - Instant { tv_sec: 5982, tv_nsec: 263217857 }

    loop process, 43.518µs

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.237:5353

    scheduling Read for: 1

    scheduling Read for: 1

    sending packet to 224.0.0.251:5353

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    scheduling Read for: 2

    loop poll - 93.033412ms

    loop time - Instant { tv_sec: 5982, tv_nsec: 356318842 }

    loop process, 49.073µs

    received packet from 192.168.1.182:5353

    received packet from 192.168.1.182:5353 with no query

    scheduling Read for: 1

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 221.33µs

    loop time - Instant { tv_sec: 5982, tv_nsec: 356617671 }

    loop process, 44.907µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 259.871256ms

    loop time - Instant { tv_sec: 5982, tv_nsec: 616556834 }

    loop process, 49.444µs

    Polling message stream

    msgstream found Ok(NotReady)

    kMessageTypeNotify "Nokia 5.3" a0f1362bbbc96bff82ba9a7ebcfc56ffcacfdeda 1698225544 1605868509853

    loop poll - 12.573µs

    loop time - Instant { tv_sec: 5982, tv_nsec: 616906329 }

    loop process, 53.907µs

    event Readable Token(0)

    loop process - 1 events, 0.000s

    scheduling Read for: 0

    loop poll - 543.555099ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 160539298 }

    loop process, 43.388µs

    Conn::read_head

    adding I/O source: 41943046

    registering with poller

    event Readable | Writable Token(41943046)

    loop process - 1 events, 0.000s

    read 222 bytes

    Request.parse([Header; 100], [u8; 222])

    Request.parse Complete(222)

    maybe_literal not found, copying "Keep-Alive"

    parsed 6 headers (222 bytes)

    incoming body is content-length (0 bytes)

    expecting_continue(version=Http11, header=None) = false

    should_keep_alive(version=Http11, header=Some(Connection([KeepAlive]))) = true

    read_keep_alive; is_mid_message=true

    should_keep_alive(version=Http11, header=None) = true

    Server::encode has_body=true, method=Some(Get)

    encoding chunked 450B

    flushed 546 bytes

    scheduling Read for: 6

    maybe_notify; read_from_io blocked

    flushed State { reading: Init, writing: Init, keep_alive: Idle, error: None }

    wants_read_again? false

    loop poll - 787.379µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 161394954 }

    loop process, 38.092µs

    event Readable | Writable Token(41943046)

    loop process - 1 events, 0.000s

    Conn::read_head

    read 0 bytes

    parse eof

    State::close_read()

    read eof

    read_keep_alive; is_mid_message=true

    flushed State { reading: Closed, writing: Init, keep_alive: Disabled, error: None }

    wants_read_again? false

    shut down IO

    deregistering handle with poller

    dropping I/O source: 6

    loop poll - 4.593349ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 166047450 }

    loop process, 43.944µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 41.998152ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 208114046 }

    loop process, 41.425µs

    Polling message stream

    msgstream found Ok(NotReady)

    kMessageTypeLoad "Nokia 5.3" a0f1362bbbc96bff82ba9a7ebcfc56ffcacfdeda 1698226110 1605868509853

    State: context_uri: "spotify:album:69fOwmdCZIaWPE4OLLnuQi" index: 2 position_ms: 7039 status: kPlayStatusPause position_measured_at: 1605868510465 context_description: "" shuffle: false repeat: false playing_from_fallback: true row: 0 playing_track_index: 2 track {gid: "\031\327?\022\[email protected]\305\246*\272\372\235\n\334%"} track {gid: "#(\261\262\022\310G\235\267.O\203\333v8\366"} track {gid: "\323E\211\022\224\232NA\257o\303\371X\236-\025"} track {gid: "2|8p\322^Kl\203\321Loz\351\363e"} track {gid: "oUlG\312\005HF\210\266s!\3548\210m"} track {gid: "\350\312\377(\233|E.\252x\302k*\256\326A"} track {gid: "%\002\306[\262\023D\313\253\302V\213-Y\000\265"} track {gid: "_\372\34678\377MR\254\034\321\333\3013\036e"} track {gid: "?\357\312\321\336NK\022\242\343\226c\301~#\203"} track {gid: "\200\022\345\251\2455C[\246\245\271\352\206\031\2504"} track {gid: "\211\320!W/\H\221\203hN\376<6\361\220"} track {gid: "\275\261\002\316\377DI\231\221\252l\032Q\267l\205"} track {gid: "\201\345-\2178BG\213\255\306\231\211\322&"} track {gid: "\344<7!t\036C\005\227:y\242G\037\302\373"} track {gid: "\310\252\t\254\343AM\336\253\230\206JRa\310~"} track {gid: "$\327*\310\237cK\340\256\231\353\215\215\365\200\"} track {gid: "\252\365\n\227r\023O\272\251\023\234\352H\251\335\345"} track {gid: "\306\017\265\0252<EA\270\300\254#\316\306\[email protected]"} track {gid: "@ty\372\201\251G\377\220\213\r\221\302\333\202\331"}

    Frame has 19 tracks

    Loading context: spotify:album:69fOwmdCZIaWPE4OLLnuQi index: [2] of 19

    command=Load(SpotifyId { id: 280828155756069103410723341265459358997, audio_type: Track }, false, 7039)

    scheduling Read for: 5

    flushing framed transport

    writing; remaining=899

    framed transport flushed

    loop poll - 198.812µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 209566175 }

    loop process, 40.203µs

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.237:5353

    scheduling Read for: 1

    scheduling Read for: 1

    sending packet to 224.0.0.251:5353

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    scheduling Read for: 2

    loop poll - 9.944059ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 219570696 }

    loop process, 44.148µs

    received packet from 192.168.1.182:5353

    received packet from 192.168.1.182:5353 with no query

    scheduling Read for: 1

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 167.257µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 219807786 }

    loop process, 40.777µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 13.378034ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 233247819 }

    loop process, 46.814µs

    Polling message stream

    msgstream found Ok(NotReady)

    Loading with Spotify URI spotify:track:6qF9QltwbDeujrVrzpCoLj

    new AudioKeyManager

    loop poll - 12µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 233424817 }

    loop process, 141.387µs

    Downloading file 18e55e1787e41646e9d2725de2f0c9648c046ee0

    requesting chunk 0

    new ChannelManager

    scheduling Read for: 5

    flushing framed transport

    writing; remaining=102

    framed transport flushed

    loop poll - 345.07µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 233940218 }

    loop process, 47.036µs

    consuming notification queue

    loop poll - 105.147µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 234118882 }

    loop process, 52.203µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 24.65365ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 258847753 }

    loop process, 44.685µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 13.338442ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 272257268 }

    loop process, 47.703µs

    consuming notification queue

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    loop poll - 384.828µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 272715392 }

    loop process, 45.499µs

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 115.499µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 272900223 }

    loop process, 41.037µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 5.391635ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 278354765 }

    loop process, 41.777µs

    loop poll - 46.869µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 278465763 }

    loop process, 42.815µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 12.544063ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 291074196 }

    loop process, 49.166µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    loop poll - 322.348µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 291470821 }

    loop process, 78.906µs

    attempting to decode a frame

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 1.434075ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 293007468 }

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    loop process, 318.922µs

    attempting to decode a frame

    attempting to decode a frame

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 921.878µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 294275989 }

    loop process, 43.758µs

    loop poll - 80.61µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 294424172 }

    loop process, 43.222µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 829.805µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 295319124 }

    loop process, 41.61µs

    loop poll - 44.759µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 295427863 }

    loop process, 42.129µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 3.165756ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 298657118 }

    loop process, 41.277µs

    loop poll - 46.221µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 298766783 }

    loop process, 42.204µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 16.197906ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 315028244 }

    loop process, 44.962µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 5.874888ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 321299090 }

    loop process, 48.573µs

    loop poll - 537.234µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 321912767 }

    loop process, 44.129µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 15.654931ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 337634031 }

    loop process, 39.814µs

    chunk 0 / 26 complete

    requesting chunk 1

    loop poll - 146.72µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 337842824 }

    loop process, 39.426µs

    scheduling Read for: 5

    Normalisation Data: NormalisationData { track_gain_db: -6.9300003, track_peak: 0.9973827, album_gain_db: -10.010002, album_peak: 1.0153364 }

    Applied normalisation factor: 0.45029798
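    (Assuming the standard ReplayGain dB-to-linear conversion and no configured pregain, the applied factor is consistent with the reported track gain: 10^(-6.93 / 20) ≈ 0.4503, matching the 0.45029798 above.)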

    flushing framed transport

    writing; remaining=53

    Caught panic with message: attempted to zero-initialize type librespot_tremor::tremor_sys::ov_callbacks, which is invalid

    framed transport flushed

    loop poll - 289.238µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 338198487 }

    loop process, 50.258µs

    loop poll - 16.778µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 338295245 }

    loop process, 49.036µs

    drop Player[0]

    loop poll - 474.254µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 338848775 }

    loop process, 44.647µs

    Polling message stream

    msgstream found Ok(NotReady)

    loop poll - 14.74µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 339007976 }

    loop process, 38.537µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 33.415502ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 372480922 }

    loop process, 41.962µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 18.377471ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 390924947 }

    loop process, 41.703µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 17.24143ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 408231820 }

    loop process, 39.666µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 57.642046ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 465937365 }

    loop process, 46.647µs

    Polling message stream

    msgstream found Ok(NotReady)

    kMessageTypeNotify "Nokia 5.3" a0f1362bbbc96bff82ba9a7ebcfc56ffcacfdeda 1698226403 1605868510712

    loop poll - 12.5µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 466218380 }

    loop process, 45.999µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 1.208130976s

    loop time - Instant { tv_sec: 5984, tv_nsec: 674418207 }

    loop process, 41.963µs

    Polling message stream

    msgstream found Ok(NotReady)

    kMessageTypePlay "Nokia 5.3" a0f1362bbbc96bff82ba9a7ebcfc56ffcacfdeda 1698227616 1605868510712

    Caught panic with message: called Result::unwrap() on an Err value: "SendError(..)"

    drop Spirc[0]

    Shutting down player thread ...

    Player thread panicked!

    drop Session[0]

    drop AudioKeyManager

    drop ChannelManager

    drop MercuryManager

    Dropping AMessageStream

    AMessageStream telling ADriver to quit

    shutdown; state=pool::State { lifecycle: Running, num_futures: 0 }

    -> transitioned to shutdown

    -> shutting down workers

    dropping I/O source: 3

    dropping I/O source: 4

    deregistering handle with poller

    dropping I/O source: 0

    deregistering handle with poller

    dropping I/O source: 1

    deregistering handle with poller

    dropping I/O source: 2

    drop Dispatch

    deregistering handle with poller

    dropping I/O source: 5

    Compilation flags

    • [x] dbus_mpris
    • [x] dbus_keyring
    • [ ] alsa_backend
    • [ ] portaudio_backend
    • [x] pulseaudio_backend
    • [ ] rodio_backend

    Note that this occurs regardless of which features are enabled; the originally working build had only alsa_backend enabled, and that no longer works either.
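    For context, this panic message comes from the runtime check that fires when std::mem::zeroed is used to build a value of a type that has no valid all-zero bit pattern; function pointers, as used in the Vorbis ov_callbacks bindings, are non-nullable. A minimal sketch that triggers the same class of panic (the Callbacks struct here is hypothetical, not the actual librespot-tremor type):

        use std::mem;

        // A struct holding a bare function pointer. Function pointers in Rust are
        // non-nullable, so the all-zero bit pattern is not a valid value for it.
        struct Callbacks {
            read: fn(buf: &mut [u8]) -> usize,
        }

        fn main() {
            // On recent compilers this panics at runtime with a message of the form
            // "attempted to zero-initialize type `Callbacks`, which is invalid",
            // the same kind of failure reported in the log above.
            let _cbs: Callbacks = unsafe { mem::zeroed() };
            println!("not reached if the zero-init check fires");
        }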

    Versions (please complete the following information):

    • OS: Raspbian Buster (armhf) on kernel 5.4.75-v7l+
    • Spotifyd: v0.2.24, both crates.io release and 39106ed8ed270247b1203cc2eed5b05121c90cf7
    • cargo: cargo 1.48.0 (65cbdd2dc 2020-10-14)
    Labels: bug, blocked by: librespot, reproducibility: easy
  • Web API related MPRIS commands stop working after one hour

    Sep 27, 2019

    Still using the old 0.2.10 version with MPRIS. There is a process that polls information from several MPRIS services (not just spotifyd) every few seconds. The spotifyd MPRIS interface seems to be responsive for only about one hour; changing the polling interval has no effect on when it stops working, and the problem consistently appears after roughly one hour.

    Labels: bug, help wanted, wontfix
  • meta: adding contributors to file

    Feb 3, 2021
