E2E Testing with WebdriverIO
End-to-end tests for JellyTau using WebdriverIO and tauri-driver. These tests run against a real Tauri app instance with an isolated test database.
Quick Start
# 1. Configure test credentials (first time only)
cp e2e/.env.example e2e/.env
# Edit e2e/.env with your Jellyfin server details
# 2. Build the frontend
bun run build
# 3. Run E2E tests
bun run test:e2e
Configuration
Test Credentials
E2E tests use credentials from e2e/.env (gitignored). Copy the example file to get started:
cp e2e/.env.example e2e/.env
e2e/.env (your private file):
# Your Jellyfin test server
TEST_SERVER_URL=https://your-jellyfin.example.com
TEST_SERVER_NAME=My Test Server
# Test user credentials
TEST_USERNAME=testuser
TEST_PASSWORD=yourpassword
# Optional: Specific test data IDs
TEST_MUSIC_LIBRARY_ID=abc123
TEST_ALBUM_ID=xyz789
# ... etc
Important:
- ✅ .env is gitignored - your credentials stay private
- ✅ Tests fall back to the Jellyfin demo server if .env doesn't exist
- ✅ Share .env.example with your team so they can set up their own
Isolated Test Database
Your production data is safe! E2E tests use a completely separate database:
- Production: ~/.local/share/com.dtourolle.jellytau/ - your real data ✅
- E2E tests: /tmp/jellytau-test-data/ - isolated test data ✅
This is configured via the JELLYTAU_DATA_DIR environment variable in wdio.conf.ts.
Architecture
Test Structure
e2e/
├── .env.example # Template for test credentials
├── .env # Your credentials (gitignored)
├── specs/ # Test specifications
│ ├── app-launch.e2e.ts # App initialization tests
│ ├── auth.e2e.ts # Authentication flow
│ └── navigation.e2e.ts # Navigation and routing
├── pageobjects/ # Page Object Model (POM)
│ ├── BasePage.ts # Base class with common methods
│ ├── LoginPage.ts # Login page interactions
│ └── HomePage.ts # Home page interactions
└── helpers/ # Test utilities
├── testConfig.ts # Load .env configuration
└── testSetup.ts # Setup helpers
Page Object Model
Tests use the Page Object Model pattern for maintainability:
// Good: Using page objects
import LoginPage from "../pageobjects/LoginPage";
await LoginPage.waitForLoginPage();
await LoginPage.connectToServer(testConfig.serverUrl);
await LoginPage.login(testConfig.username, testConfig.password);
// Bad: Direct selectors in tests
await $("#server-url").setValue("https://...");
await $("button").click();
Writing Tests
Using Test Configuration
Always use testConfig for credentials and server details:
import { testConfig } from "../helpers/testConfig";
describe("My Feature", () => {
it("should test something", async () => {
// Use testConfig instead of hardcoded values
await LoginPage.connectToServer(testConfig.serverUrl);
await LoginPage.login(testConfig.username, testConfig.password);
// Access optional test data
if (testConfig.albumId) {
// Test with specific album
}
});
});
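For reference, the loader in helpers/testConfig.ts can be as simple as reading environment variables with the documented fallbacks. This is a minimal sketch under that assumption; the real helper may also load e2e/.env via dotenv before reading process.env.

```typescript
// Sketch of a testConfig loader (the actual helpers/testConfig.ts may differ).
export interface TestConfig {
  serverUrl: string;
  serverName: string;
  username: string;
  password: string;
  musicLibraryId?: string;
  albumId?: string;
  trackId?: string;
}

export const testConfig: TestConfig = {
  // Fall back to the public Jellyfin demo server when no .env is present
  serverUrl: process.env.TEST_SERVER_URL ?? "https://demo.jellyfin.org/stable",
  serverName: process.env.TEST_SERVER_NAME ?? "Demo Server",
  username: process.env.TEST_USERNAME ?? "demo",
  password: process.env.TEST_PASSWORD ?? "",
  // Optional test data IDs stay undefined when unset, so tests can skip gracefully
  musicLibraryId: process.env.TEST_MUSIC_LIBRARY_ID,
  albumId: process.env.TEST_ALBUM_ID,
  trackId: process.env.TEST_TRACK_ID,
};
```

Because every field has a fallback, specs can import testConfig unconditionally and only guard on the optional IDs.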
Test Data IDs
For tests that need specific content (albums, tracks, etc.):
- Find the ID in your Jellyfin server (check the URL when viewing an item)
- Add it to your e2e/.env: TEST_ALBUM_ID=abc123def456
- Use it in tests:
if (testConfig.albumId) { await browser.url(`/album/${testConfig.albumId}`); }
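The "skip if unconfigured" check can be factored into a small reusable helper. The helper below is hypothetical (it is not part of the existing e2e/helpers/), but shows the pattern:

```typescript
// Hypothetical helper: returns true only when every optional test data
// value is configured, logging which ones are missing otherwise.
export function hasTestData(
  values: Record<string, string | undefined>
): boolean {
  const missing = Object.entries(values)
    .filter(([, value]) => value === undefined || value === "")
    .map(([name]) => name);
  if (missing.length > 0) {
    console.log(`Skipping - not configured: ${missing.join(", ")}`);
    return false;
  }
  return true;
}
```

A spec would then start with something like: if (!hasTestData({ TEST_ALBUM_ID: testConfig.albumId })) return;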
Example Test
import { expect } from "@wdio/globals";
import LoginPage from "../pageobjects/LoginPage";
import { testConfig } from "../helpers/testConfig";
describe("Album Playback", () => {
beforeEach(async () => {
// Login before each test
await LoginPage.waitForLoginPage();
await LoginPage.fullLoginFlow(
testConfig.serverUrl,
testConfig.username,
testConfig.password
);
});
it("should play an album", async () => {
// Skip if no test album configured
if (!testConfig.albumId) {
console.log("Skipping - no TEST_ALBUM_ID configured");
return;
}
// Navigate to album
await browser.url(`/album/${testConfig.albumId}`);
// Click play
const playButton = await $('[aria-label="Play"]');
await playButton.click();
// Verify playback started
const miniPlayer = await $(".mini-player");
expect(await miniPlayer.isDisplayed()).toBe(true);
});
});
Running Tests
Commands
# Run all E2E tests
bun run test:e2e
# Run in watch mode (development)
bun run test:e2e:dev
# Run specific test file
bun run test:e2e -- e2e/specs/auth.e2e.ts
Before Running
Always build the frontend first:
bun run build
cd src-tauri && cargo build
The debug binary expects built frontend files in the build/ directory.
Test Files
app-launch.e2e.ts
Basic app initialization tests:
- App launches successfully
- UI renders correctly
- Unauthenticated users redirect to login
Status: ✅ Working (no credentials needed)
auth.e2e.ts
Full authentication flow:
- Server connection (2-step process)
- Login form validation
- Error handling
- Complete auth flow
Status: ✅ Working with any Jellyfin server
navigation.e2e.ts
Routing and navigation:
- Protected routes
- Redirects
- Navigation after login
Status: ⚠️ Needs valid credentials (configure .env)
Configuration Reference
wdio.conf.ts
Main WebdriverIO configuration:
{
port: 4444, // tauri-driver port
maxInstances: 1, // Run tests sequentially
logLevel: "warn", // Reduce noise
framework: "mocha",
timeout: 60000, // 60s test timeout
capabilities: [{
"tauri:options": {
application: "path/to/app",
env: {
JELLYTAU_DATA_DIR: "/tmp/jellytau-test-data" // Isolated DB
}
}
}]
}
Environment Variables
| Variable | Description | Default |
|---|---|---|
| TEST_SERVER_URL | Jellyfin server URL | https://demo.jellyfin.org/stable |
| TEST_SERVER_NAME | Server display name | Demo Server |
| TEST_USERNAME | Test user username | demo |
| TEST_PASSWORD | Test user password | (empty) |
| TEST_MUSIC_LIBRARY_ID | Music library ID | undefined |
| TEST_ALBUM_ID | Album ID for playback tests | undefined |
| TEST_TRACK_ID | Track ID for tests | undefined |
| TEST_TIMEOUT | Mocha test timeout (ms) | 60000 |
| TEST_WAIT_TIMEOUT | Element wait timeout (ms) | 15000 |
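The timeout variables arrive as strings, so a config loader has to parse them with a fallback. A minimal sketch of that parsing (the function name is illustrative, not an existing helper):

```typescript
// Parse a numeric environment variable, falling back when the
// variable is unset or not a valid integer.
function intFromEnv(name: string, fallback: number): number {
  const raw = process.env[name];
  const parsed = raw === undefined ? NaN : Number.parseInt(raw, 10);
  return Number.isNaN(parsed) ? fallback : parsed;
}

export const timeouts = {
  test: intFromEnv("TEST_TIMEOUT", 60000), // Mocha test timeout (ms)
  wait: intFromEnv("TEST_WAIT_TIMEOUT", 15000), // element wait timeout (ms)
};
```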
Debugging
View Application During Tests
Tests run with a visible window. To pause and inspect:
it("debug test", async () => {
await LoginPage.waitForLoginPage();
// Pause for 10 seconds to inspect
await browser.pause(10000);
await LoginPage.enterServerUrl(testConfig.serverUrl);
});
Check Logs
- WebdriverIO logs: console output (set logLevel: "info" in the config)
- tauri-driver logs: stdout/stderr from the driver process
- App logs: Check app console (if running with dev tools)
Common Issues
"Connection refused" in browser body
- Cause: the frontend has not been built
- Solution: run bun run build before testing
"Element not found" errors
- Selector might be wrong
- Element not loaded yet - add a wait: await element.waitForDisplayed()
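waitForDisplayed is essentially a polling loop: retry a condition until it passes or a timeout elapses. A generic version of that idea (pure TypeScript, no wdio dependency; names are illustrative) looks like this:

```typescript
// Generic poll-until-true helper, the same idea behind
// element.waitForDisplayed(): re-check a condition every intervalMs
// until it returns true or timeoutMs elapses.
async function waitUntil(
  condition: () => Promise<boolean> | boolean,
  timeoutMs = 15000,
  intervalMs = 100
): Promise<void> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    if (await condition()) return;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`Condition not met within ${timeoutMs}ms`);
}
```

In specs, prefer the built-in waitForDisplayed/waitForExist commands; this sketch is only to show why a flat "element not found" usually means the check ran before the element rendered.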
"Invalid session id"
- Normal when app closes between tests
- Each test file gets a fresh app instance
Tests fail with "no .env file"
- Copy e2e/.env.example to e2e/.env
- Configure your Jellyfin server details
Database still using production data
- Check that wdio.conf.ts sets the JELLYTAU_DATA_DIR env var
- Rebuild the app: cd src-tauri && cargo build
Platform Support
Supported
- ✅ Linux - Primary development platform
- ✅ Windows - Supported (paths auto-detected)
- ✅ macOS - Supported (paths auto-detected)
Not Supported
- ❌ Android - E2E testing requires Appium + emulators (out of scope)
- Desktop tests cover 90% of app logic anyway
Team Collaboration
Sharing Test Configuration
DO:
- ✅ Commit e2e/.env.example with template values
- ✅ Update the README when adding new test data requirements
- ✅ Use descriptive variable names in .env.example
DON'T:
- ❌ Commit e2e/.env with real credentials
- ❌ Hardcode server URLs in test files
- ❌ Skip authentication in tests (always test full flows)
Setting Up for a New Team Member
- Clone repo
- Copy the env template: cp e2e/.env.example e2e/.env
- Configure credentials: edit e2e/.env with your Jellyfin server details
- Build the frontend: bun run build
- Run tests: bun run test:e2e
That's it! No shared credentials needed.
Best Practices
- Use testConfig: Never hardcode credentials
- Use Page Objects: Keep selectors out of test specs
- Wait for Elements: Always use .waitForDisplayed() before interacting
- Independent Tests: Each test should work standalone
- Skip Gracefully: Check for optional test data before using it
- Build First: Always run bun run build before running tests
- Clear Names: Use descriptive describe and it blocks
Future Enhancements
- Add more page objects (Player, Library, Queue, Settings)
- Create test data fixtures
- Add visual regression testing
- Mock Jellyfin API for faster, more reliable tests
- CI/CD integration (GitHub Actions)
- Test report generation
- Screenshot capture on failure
- Video recording of test runs