Compare commits

...

108 Commits
v1.7.1 ... dev

Author SHA1 Message Date
Félix MARQUET
202f2c979f Merge pull request #38 from BreizhHardware/dependabot/cargo/dev/serde_json-1.0.142
build(deps): bump serde_json from 1.0.140 to 1.0.142
2025-08-12 14:42:07 +02:00
dependabot[bot]
3d8b5bd726 build(deps): bump serde_json from 1.0.140 to 1.0.142
Bumps [serde_json](https://github.com/serde-rs/json) from 1.0.140 to 1.0.142.
- [Release notes](https://github.com/serde-rs/json/releases)
- [Commits](https://github.com/serde-rs/json/compare/v1.0.140...v1.0.142)

---
updated-dependencies:
- dependency-name: serde_json
  dependency-version: 1.0.142
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-08-11 13:54:27 +00:00
Félix MARQUET
905c352eff Merge pull request #37 from BreizhHardware/dependabot/cargo/dev/tokio-1.47.1
build(deps): bump tokio from 1.45.1 to 1.47.1
2025-08-11 15:53:23 +02:00
Félix MARQUET
1c847be019 Merge pull request #34 from BreizhHardware/dependabot/cargo/dev/rand-0.9.2
build(deps): bump rand from 0.9.1 to 0.9.2
2025-08-11 15:53:09 +02:00
Félix MARQUET
b0e8c0ff39 Merge pull request #33 from BreizhHardware/dependabot/cargo/dev/rusqlite-0.37.0
build(deps): bump rusqlite from 0.36.0 to 0.37.0
2025-08-11 15:52:58 +02:00
Félix MARQUET
2af1314e4d Merge pull request #31 from BreizhHardware/dependabot/cargo/dev/reqwest-0.12.22
build(deps): bump reqwest from 0.12.20 to 0.12.22
2025-08-11 14:52:30 +02:00
dependabot[bot]
4019137602 build(deps): bump rand from 0.9.1 to 0.9.2
Bumps [rand](https://github.com/rust-random/rand) from 0.9.1 to 0.9.2.
- [Release notes](https://github.com/rust-random/rand/releases)
- [Changelog](https://github.com/rust-random/rand/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rust-random/rand/compare/rand_core-0.9.1...rand_core-0.9.2)

---
updated-dependencies:
- dependency-name: rand
  dependency-version: 0.9.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-08-11 10:37:18 +00:00
dependabot[bot]
ed066c5d5b build(deps): bump reqwest from 0.12.20 to 0.12.22
Bumps [reqwest](https://github.com/seanmonstar/reqwest) from 0.12.20 to 0.12.22.
- [Release notes](https://github.com/seanmonstar/reqwest/releases)
- [Changelog](https://github.com/seanmonstar/reqwest/blob/master/CHANGELOG.md)
- [Commits](https://github.com/seanmonstar/reqwest/compare/v0.12.20...v0.12.22)

---
updated-dependencies:
- dependency-name: reqwest
  dependency-version: 0.12.22
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-08-11 10:35:45 +00:00
dependabot[bot]
a9923624a1 build(deps): bump rusqlite from 0.36.0 to 0.37.0
Bumps [rusqlite](https://github.com/rusqlite/rusqlite) from 0.36.0 to 0.37.0.
- [Release notes](https://github.com/rusqlite/rusqlite/releases)
- [Changelog](https://github.com/rusqlite/rusqlite/blob/master/Changelog.md)
- [Commits](https://github.com/rusqlite/rusqlite/compare/v0.36.0...v0.37.0)

---
updated-dependencies:
- dependency-name: rusqlite
  dependency-version: 0.37.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-08-11 10:35:15 +00:00
dependabot[bot]
eb9fa5954f build(deps): bump tokio from 1.45.1 to 1.47.1
Bumps [tokio](https://github.com/tokio-rs/tokio) from 1.45.1 to 1.47.1.
- [Release notes](https://github.com/tokio-rs/tokio/releases)
- [Commits](https://github.com/tokio-rs/tokio/compare/tokio-1.45.1...tokio-1.47.1)

---
updated-dependencies:
- dependency-name: tokio
  dependency-version: 1.47.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-08-11 10:34:56 +00:00
Félix MARQUET
f7ca4622b6 Merge pull request #30 from BreizhHardware/fix/config-reset
fix(main): Update environment variable handling and API startup logic
2025-08-10 11:45:35 +02:00
Félix MARQUET
1ffa17d82e fix(api, docker, github, models): Suppress unused variable warnings by adding #[allow(dead_code)] 2025-08-10 11:36:28 +02:00
Félix MARQUET
bd36cf5ad9 fix(config): Implement selective update of app settings from environment variables 2025-08-10 11:21:15 +02:00
Félix MARQUET
d6712b738b Update src/main.rs
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-07-04 16:03:02 +02:00
Félix MARQUET
e0d8b4636e Update src/main.rs
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-07-04 16:02:56 +02:00
Félix MARQUET
36f366c1c8 Update src/main.rs
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-07-04 16:02:50 +02:00
Félix MARQUET
2a035e1ea1 fix(main): Update environment variable handling and API startup logic 2025-07-04 15:59:56 +02:00
Félix MARQUET
9ef9179995 Merge pull request #29 from BreizhHardware/dependabot/cargo/dev/rand-0.9.1
build(deps): bump rand from 0.8.5 to 0.9.1
2025-07-01 12:41:15 +02:00
dependabot[bot]
f510233c55 build(deps): bump rand from 0.8.5 to 0.9.1
Bumps [rand](https://github.com/rust-random/rand) from 0.8.5 to 0.9.1.
- [Release notes](https://github.com/rust-random/rand/releases)
- [Changelog](https://github.com/rust-random/rand/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rust-random/rand/compare/0.8.5...rand_core-0.9.1)

---
updated-dependencies:
- dependency-name: rand
  dependency-version: 0.9.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-01 09:57:25 +00:00
Félix MARQUET
2f033a6460 Merge pull request #28 from BreizhHardware/dependabot/cargo/dev/bcrypt-0.17.0
build(deps): bump bcrypt from 0.15.1 to 0.17.0
2025-07-01 11:56:18 +02:00
dependabot[bot]
5bac3d5bca build(deps): bump bcrypt from 0.15.1 to 0.17.0
Bumps [bcrypt](https://github.com/Keats/rust-bcrypt) from 0.15.1 to 0.17.0.
- [Commits](https://github.com/Keats/rust-bcrypt/compare/v0.15.1...v0.17.0)

---
updated-dependencies:
- dependency-name: bcrypt
  dependency-version: 0.17.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-30 03:16:36 +00:00
Félix MARQUET
00f0e8363d Merge pull request #27 from BreizhHardware/fix/405-app-repo
fix(nginx): Add app_repo to API routes configuration
2025-06-25 13:32:27 +02:00
Félix MARQUET
be4c88c315 fix(api): Update fetch endpoint for GitHub repository in GithubRepoSection.vue 2025-06-25 13:23:15 +02:00
Félix MARQUET
b2a03226f2 fix(nginx): Add app_repo to API routes configuration 2025-06-25 12:52:30 +02:00
Félix MARQUET
b8228ffe33 Merge pull request #26 from BreizhHardware/feat/onboarding
feat(auth): Implement user authentication with login, registration, and session management
2025-06-24 14:36:40 +02:00
Félix MARQUET
18b9eb25dc fix(database): Update error handling for password hashing in create_user function 2025-06-24 14:31:35 +02:00
Félix MARQUET
fd132cf7d8 Update src/database.rs
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-06-24 14:24:42 +02:00
Félix MARQUET
be9d7299cd Update entrypoint.sh
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-06-24 14:24:12 +02:00
Félix MARQUET
c2a86cb9d4 feat(onboarding): Update onboarding UI and enhance CI/CD workflows [bump-minor] 2025-06-24 14:16:54 +02:00
Félix MARQUET
288192bd29 feat(onboarding): Remove admin onboarding button and streamline onboarding logic for existing admin check 2025-06-24 14:06:10 +02:00
Félix MARQUET
bfc0c34029 feat(onboarding): Add NTFY username and password fields to settings and update auth handling 2025-06-24 13:56:37 +02:00
Félix MARQUET
31d3c34697 feat(onboarding): Add NTFY username and password fields to onboarding and update auth handling 2025-06-24 13:43:00 +02:00
Félix MARQUET
fde4574b76 feat(onboarding): Adjust onboarding steps for admin account creation and notification service selection 2025-06-24 13:19:46 +02:00
Félix MARQUET
bf8239097c feat(onboarding): Add admin onboarding button and enhance onboarding logic for admin users 2025-06-24 13:05:08 +02:00
Félix MARQUET
844880d1fe feat(onboarding): Implement admin account creation step in onboarding process 2025-06-23 16:57:10 +02:00
Félix MARQUET
c6945a6948 feat(onboarding): Enhance registration process with admin approval and update AppHeader for authenticated users 2025-06-23 16:36:16 +02:00
Félix MARQUET
edcde5bb52 feat(onboarding): Add authentication middleware for route protection 2025-06-23 16:08:29 +02:00
Félix MARQUET
1e6e119116 feat(onboarding): Add authentication middleware for route protection 2025-06-23 15:55:01 +02:00
Félix MARQUET
f844365b9c feat(onboarding): Move LatestUpdates and repository sections to a new index.vue component 2025-06-23 15:46:31 +02:00
Félix MARQUET
3f069f8c15 feat(nginx): Improve server configuration for static files and API routing 2025-06-23 15:26:15 +02:00
Félix MARQUET
7c8b04808e feat(nginx): Enhance static file handling and consolidate API route configurations 2025-06-23 15:15:33 +02:00
Félix MARQUET
af15ab974d refactor(api, database, main, ntfy): Simplify database lock handling and clean up unused imports 2025-06-23 15:05:19 +02:00
Félix MARQUET
8ca81b2ed3 feat(api): Add GitHub repository endpoint and implement authorization header handling 2025-06-23 15:04:52 +02:00
Félix MARQUET
e022b7ac2d Update Dockerfile
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-06-23 14:54:13 +02:00
Félix MARQUET
c060604c21 Update entrypoint.sh
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-06-23 14:54:04 +02:00
Félix MARQUET
11e33961dc Update src/models.rs
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-06-23 14:53:49 +02:00
Félix MARQUET
afe436f66b feat(auth): Implement user authentication with login, registration, and session management 2025-06-23 14:45:38 +02:00
Félix MARQUET
72d82148ab Merge pull request #25 from BreizhHardware/fix/nuxt-md
fix(latest-updates): Enhance changelog display with toggle functional…
2025-06-23 14:16:22 +02:00
Félix MARQUET
5267d6126c fix(latest-updates): Enhance changelog display with toggle functionality in LatestUpdates.vue 2025-06-23 14:15:30 +02:00
Félix MARQUET
2f39c7a910 Merge pull request #24 from BreizhHardware/fix/nuxt-md
Fix/nuxt md
2025-06-23 14:15:12 +02:00
Félix MARQUET
42ead2733c fix(latest-updates): Enhance changelog display with toggle functionality in LatestUpdates.vue 2025-06-23 14:05:24 +02:00
Félix MARQUET
c1d8002cd5 fix(latest-updates): Simplify update label format in LatestUpdates.vue 2025-06-23 13:40:50 +02:00
Félix MARQUET
f7130b3411 feat(ci): Add GitHub Actions workflow for building and pushing Docker PR images 2025-06-23 13:31:51 +02:00
Félix MARQUET
ee9669f154 fix(latest-updates): Remove unused comments and styles in LatestUpdates.vue 2025-06-23 13:29:16 +02:00
Félix MARQUET
42f5ac0133 feat(latest-updates): Render changelogs using marked and update styles 2025-06-23 13:27:25 +02:00
Félix MARQUET
94c9da0eff refactor(nginx): Update entrypoint and nginx configuration for static Nuxt serving 2025-06-23 13:15:27 +02:00
Félix MARQUET
417a4e7eb5 fix(ci): Update output directory paths in create_dev.yml 2025-06-23 13:04:48 +02:00
Félix MARQUET
3b95306974 fix(ci): Update artifact upload path to target public directory 2025-06-22 13:22:40 +02:00
Félix MARQUET
29fa37f2f2 fix(ci): Update paths for Docker build and artifact upload 2025-06-22 13:17:46 +02:00
Félix MARQUET
34866799a2 refactor(nuxt): Remove server-side rendering and generation exclusions 2025-06-20 13:44:14 +02:00
Félix MARQUET
2d4488bd83 feat(ci): Add check for output directory existence in create_dev.yml 2025-06-20 13:43:05 +02:00
Félix MARQUET
3807b73a09 Merge pull request #23 from BreizhHardware/refactor/web-nuxt
refactor(web): refactor the web interface using Nuxt and NuxtUI
2025-06-20 13:35:57 +02:00
Félix MARQUET
c01603f16f fix(api): Fix missing import 2025-06-20 13:35:24 +02:00
Félix MARQUET
5e3af6f49a Update src/api.rs
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-06-20 13:32:37 +02:00
Félix MARQUET
82c613f1d3 Update Dockerfile
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-06-20 13:30:13 +02:00
Félix MARQUET
869f22a9d1 Update src/api.rs
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-06-20 13:30:06 +02:00
Félix MARQUET
ebe8853240 Update web/components/LatestUpdates.vue
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-06-20 13:29:59 +02:00
Félix MARQUET
4c696bfb60 Update src/api.rs
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-06-20 13:29:17 +02:00
Félix MARQUET
a1faa3ed8b Update web/components/LatestUpdates.vue
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-06-20 13:25:33 +02:00
Félix MARQUET
bdffae83fa refactor(web): refactor the web interface using Nuxt and NuxtUI 2025-06-20 13:22:10 +02:00
Félix MARQUET
bf35608f71 Update .github/workflows/dependabot-build.yml
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-06-16 10:32:22 +02:00
Félix MARQUET
b842d104c7 Merge pull request #20 from BreizhHardware/dependabot/cargo/dev/reqwest-0.12.20
build(deps): bump reqwest from 0.11.27 to 0.12.20
2025-06-16 10:08:33 +02:00
dependabot[bot]
b22351e77e build(deps): bump reqwest from 0.11.27 to 0.12.20
Bumps [reqwest](https://github.com/seanmonstar/reqwest) from 0.11.27 to 0.12.20.
- [Release notes](https://github.com/seanmonstar/reqwest/releases)
- [Changelog](https://github.com/seanmonstar/reqwest/blob/master/CHANGELOG.md)
- [Commits](https://github.com/seanmonstar/reqwest/compare/v0.11.27...v0.12.20)

---
updated-dependencies:
- dependency-name: reqwest
  dependency-version: 0.12.20
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-16 08:04:33 +00:00
Félix MARQUET
10d8a23897 Merge pull request #21 from BreizhHardware/dependabot/cargo/dev/rusqlite-0.36.0
build(deps): bump rusqlite from 0.29.0 to 0.36.0
2025-06-16 10:03:22 +02:00
Félix MARQUET
4e54b557b0 Merge pull request #19 from BreizhHardware/dependabot/cargo/dev/env_logger-0.11.8
build(deps): bump env_logger from 0.10.2 to 0.11.8
2025-06-16 10:01:33 +02:00
dependabot[bot]
a92caf5e37 build(deps): bump rusqlite from 0.29.0 to 0.36.0
Bumps [rusqlite](https://github.com/rusqlite/rusqlite) from 0.29.0 to 0.36.0.
- [Release notes](https://github.com/rusqlite/rusqlite/releases)
- [Changelog](https://github.com/rusqlite/rusqlite/blob/master/Changelog.md)
- [Commits](https://github.com/rusqlite/rusqlite/compare/v0.29.0...v0.36.0)

---
updated-dependencies:
- dependency-name: rusqlite
  dependency-version: 0.36.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-16 07:59:11 +00:00
dependabot[bot]
f2a6d4f0de build(deps): bump env_logger from 0.10.2 to 0.11.8
Bumps [env_logger](https://github.com/rust-cli/env_logger) from 0.10.2 to 0.11.8.
- [Release notes](https://github.com/rust-cli/env_logger/releases)
- [Changelog](https://github.com/rust-cli/env_logger/blob/main/CHANGELOG.md)
- [Commits](https://github.com/rust-cli/env_logger/compare/v0.10.2...v0.11.8)

---
updated-dependencies:
- dependency-name: env_logger
  dependency-version: 0.11.8
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-16 07:56:05 +00:00
Félix MARQUET
08bf34104a update(create_dev): add condition to skip jobs for Dependabot pull requests 2025-06-16 09:54:24 +02:00
Félix MARQUET
ff3e00eb4e add(dependabot): create build check workflow for Dependabot pull requests 2025-06-16 09:51:13 +02:00
Félix MARQUET
f52f505e38 update(README): simplify Docker image description and remove unused DB_PATH entry 2025-06-16 09:28:45 +02:00
Félix MARQUET
82f5f59413 update(create_release): modify workflow name and add dev tag for Docker image 2025-06-16 09:18:06 +02:00
Félix MARQUET
0a5945e7b3 Update src/database.rs
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-06-15 17:53:47 +02:00
Félix MARQUET
e4d2bc303f Update src/config.rs
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-06-15 17:51:39 +02:00
34729a7edd fix(api): update default database path to '/github-ntfy' 2025-06-15 17:43:36 +02:00
Félix MARQUET
21b51766bb Update src/api.rs
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-06-15 17:43:00 +02:00
Félix MARQUET
6a0031ac5d Update src/database.rs
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-06-15 17:42:47 +02:00
Félix MARQUET
79e48391cb refactor(database): add version tracking functions and update notification logic 2025-06-14 13:58:36 +02:00
Félix MARQUET
43275d1fd9 refactor(github): enhance error handling and add User-Agent header in get_latest_releases function 2025-06-14 11:45:13 +02:00
Félix MARQUET
fe33377fa0 refactor(ci): simplify CI configuration by consolidating binary build steps and updating Dockerfile 2025-06-13 13:55:10 +02:00
Félix MARQUET
60db3550c0 refactor(docker): update Dockerfile for architecture-specific binary handling and add OpenSSL dependency 2025-06-13 08:50:42 +02:00
2b9eb94337 refactor(ci): remove dynamic version tag from Docker image in create_dev.yml 2025-06-12 21:54:33 +02:00
acbd6ccc00 refactor(ci): update Docker image tags to use breizhhardware namespace 2025-06-12 21:44:39 +02:00
8c97043b2f refactor(rust): remove support for armv7 architecture in CI configuration 2025-06-12 21:39:19 +02:00
38918f0bb8 refactor(rust): simplify CI dependencies by removing unnecessary job requirement 2025-06-12 21:33:46 +02:00
622e3d4334 refactor(rust): update CI configuration for multi-architecture Docker builds 2025-06-12 21:32:20 +02:00
Félix MARQUET
b28f70b659 refactor(rust): add support for vendored OpenSSL in CI configuration 2025-06-12 20:23:06 +02:00
Félix MARQUET
5caa2b56ce refactor(rust): update CI configuration to support static OpenSSL with cross 2025-06-12 20:18:30 +02:00
Félix MARQUET
4ffa83efb4 Merge pull request #17 from BreizhHardware/refactor/rust-implementation
refactor(rust): upgrade GitHub Actions to use version 4 of upload and…
2025-06-12 20:10:13 +02:00
Félix MARQUET
39f0d6aa8b refactor(rust): upgrade GitHub Actions to use version 4 of upload and download artifacts 2025-06-12 20:09:10 +02:00
Félix MARQUET
856811a446 Merge pull request #16 from BreizhHardware/refactor/rust-implementation
Refactor/rust implementation
2025-06-12 20:06:36 +02:00
Félix MARQUET
57ea0ef54b refactor(rust): update notification functions and improve database handling 2025-06-12 20:05:44 +02:00
Félix MARQUET
cc39b743e6 refactor(rust): restructure project and update configuration management 2025-06-12 19:55:09 +02:00
Félix MARQUET
426403ad92 refactor(rust): Rewrite everything in rust 2025-06-12 19:41:10 +02:00
Félix MARQUET
d2ba0e510a refactor(rust): Rewrite everything in rust 2025-06-12 19:40:54 +02:00
Félix MARQUET
1430d39b5c fix(dependabot): ensure updates target the dev branch 2025-06-10 11:46:40 +02:00
Félix MARQUET
47fa8f820e Merge pull request #10 from BreizhHardware/dev
Create dependabot.yaml
2025-06-10 11:31:06 +02:00
Félix MARQUET
56439d8c62 Update dependabot.yaml 2025-06-10 11:30:56 +02:00
Félix MARQUET
013c5bd70d Create dependabot.yaml 2025-06-10 11:29:08 +02:00
62 changed files with 15779 additions and 1442 deletions

.github/dependabot.yaml (new file, 30 lines)

@@ -0,0 +1,30 @@
# To get started with Dependabot version updates, you'll need to specify which
# package ecosystems to update and where the package manifests are located.
# Please see the documentation for all configuration options:
# https://docs.github.com/code-security/dependabot/dependabot-version-updates/configuration-options-for-the-dependabot.yml-file
version: 2
updates:
- package-ecosystem: "cargo" # See documentation for possible values
directory: "/" # Location of package manifests
schedule:
interval: "weekly"
target-branch: "dev"
- package-ecosystem: "docker"
directory: "/"
schedule:
interval: "weekly"
target-branch: "dev"
- package-ecosystem: "github-actions"
directory: "/.github/workflows"
schedule:
interval: "weekly"
target-branch: "dev"
- package-ecosystem: "npm"
directory: "/web"
schedule:
interval: "weekly"
target-branch: "dev"

.github/workflows/build_pr.yml (new file, 141 lines)

@@ -0,0 +1,141 @@
name: Build and Push Docker PR Image
on:
pull_request:
types: [opened, synchronize, reopened]
jobs:
build-binary:
if: ${{ github.actor != 'dependabot[bot]' && !startsWith(github.ref, 'refs/heads/dependabot/') }}
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Install Rust
uses: actions-rs/toolchain@v1
with:
toolchain: stable
target: x86_64-unknown-linux-musl
override: true
- name: Install cross
run: cargo install cross
- name: Create Cross.toml to configure vendored OpenSSL
run: |
cat > Cross.toml << 'EOF'
[build.env]
passthrough = [
"RUSTFLAGS",
"OPENSSL_STATIC",
"OPENSSL_NO_VENDOR"
]
EOF
- name: Build with cross and vendored OpenSSL
env:
OPENSSL_STATIC: 1
RUSTFLAGS: "-C target-feature=+crt-static"
OPENSSL_NO_VENDOR: 0
run: |
cross build --release --target x86_64-unknown-linux-musl --features vendored-openssl
- name: Prepare the binary
run: |
mkdir -p release
cp target/x86_64-unknown-linux-musl/release/github-ntfy release/github-ntfy
- name: Upload binary as artifact
uses: actions/upload-artifact@v4
with:
name: github-ntfy-pr
path: release/github-ntfy
build-frontend:
if: ${{ github.actor != 'dependabot[bot]' && !startsWith(github.ref, 'refs/heads/dependabot/') }}
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Setup PNPM
uses: pnpm/action-setup@v2
with:
version: '10.x'
run_install: false
- name: Build Frontend (Nuxt)
run: |
cd web
pnpm install
pnpm generate
- name: Check the output directory contents
run: |
ls -la web/.output/public || echo "The .output directory does not exist!"
- name: Upload frontend as artifact
uses: actions/upload-artifact@v4
with:
name: nuxt-frontend-pr
path: web/.output/public
docker-build-push:
if: ${{ github.actor != 'dependabot[bot]' && !startsWith(github.ref, 'refs/heads/dependabot/') }}
needs: [build-binary, build-frontend]
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Configure Docker
uses: docker/setup-buildx-action@v3
- name: Login Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Extract the PR number
run: |
PR_NUMBER=$(echo $GITHUB_REF | awk 'BEGIN { FS = "/" } ; { print $3 }')
echo "PR_NUMBER=$PR_NUMBER" >> $GITHUB_ENV
- name: Download the binary
uses: actions/download-artifact@v4
with:
name: github-ntfy-pr
path: binaries
- name: Download the frontend
uses: actions/download-artifact@v4
with:
name: nuxt-frontend-pr
path: web/.output/public
- name: Prepare files for Docker
run: |
chmod +x binaries/github-ntfy
mkdir -p docker-build
cp binaries/github-ntfy docker-build/
mkdir -p docker-build/web-output/public
cp -r web/.output/public/* docker-build/web-output/public/
cp nginx.conf docker-build/
cp entrypoint.sh docker-build/
cp Dockerfile docker-build/
chmod +x docker-build/entrypoint.sh
- name: Build and push the Docker image
uses: docker/build-push-action@v6
with:
context: docker-build
push: true
tags: ${{ secrets.DOCKER_USERNAME }}/github-ntfy:pr-${{ env.PR_NUMBER }}
file: docker-build/Dockerfile

.github/workflows/create_dev.yml (new file, 137 lines)

@@ -0,0 +1,137 @@
name: Build and Push Docker Dev Image
on:
push:
branches:
- dev
jobs:
build-binary:
if: ${{ github.actor != 'dependabot[bot]' && !startsWith(github.ref, 'refs/heads/dependabot/') }}
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Install Rust
uses: actions-rs/toolchain@v1
with:
toolchain: stable
target: x86_64-unknown-linux-musl
override: true
- name: Install cross
run: cargo install cross
- name: Create Cross.toml to configure vendored OpenSSL
run: |
cat > Cross.toml << 'EOF'
[build.env]
passthrough = [
"RUSTFLAGS",
"OPENSSL_STATIC",
"OPENSSL_NO_VENDOR"
]
EOF
- name: Build with cross and vendored OpenSSL
env:
OPENSSL_STATIC: 1
RUSTFLAGS: "-C target-feature=+crt-static"
OPENSSL_NO_VENDOR: 0
run: |
cross build --release --target x86_64-unknown-linux-musl --features vendored-openssl
- name: Prepare the binary
run: |
mkdir -p release
cp target/x86_64-unknown-linux-musl/release/github-ntfy release/github-ntfy
- name: Upload binary as artifact
uses: actions/upload-artifact@v4
with:
name: github-ntfy
path: release/github-ntfy
build-frontend:
if: ${{ github.actor != 'dependabot[bot]' && !startsWith(github.ref, 'refs/heads/dependabot/') }}
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Setup PNPM
uses: pnpm/action-setup@v2
with:
version: '10.x'
run_install: false
- name: Build Frontend (Nuxt)
run: |
cd web
pnpm install
pnpm generate
- name: Check the output/public directory contents
run: |
ls -la web/.output/public || echo "The .output/public directory does not exist!"
- name: Upload frontend as artifact
uses: actions/upload-artifact@v4
with:
name: nuxt-frontend
path: web/.output/public # Specifically target the public directory
docker-build-push:
if: ${{ github.actor != 'dependabot[bot]' && !startsWith(github.ref, 'refs/heads/dependabot/') }}
needs: [build-binary, build-frontend]
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Configure Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Login Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Download the binary
uses: actions/download-artifact@v4
with:
name: github-ntfy
path: binaries
- name: Download the frontend
uses: actions/download-artifact@v4
with:
name: nuxt-frontend
path: web/.output/public
- name: Prepare files for Docker
run: |
chmod +x binaries/github-ntfy
mkdir -p docker-build
cp binaries/github-ntfy docker-build/
mkdir -p docker-build/web-output/public
cp -r web/.output/public/* docker-build/web-output/public/
cp nginx.conf docker-build/
cp entrypoint.sh docker-build/
cp Dockerfile docker-build/
chmod +x docker-build/entrypoint.sh
- name: Build and push the Docker image
uses: docker/build-push-action@v6
with:
context: docker-build
push: true
tags: ${{ secrets.DOCKER_USERNAME }}/github-ntfy:dev
file: docker-build/Dockerfile

(modified workflow file)

@@ -1,4 +1,4 @@
name: Docker Build and Release
name: Build and Release
on:
push:
@@ -6,25 +6,230 @@ on:
- main
jobs:
build-and-push-on-docker-hub:
version:
runs-on: ubuntu-latest
outputs:
version: ${{ steps.version.outputs.tag }}
steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Compute the next version
id: version
run: |
# Get the latest tag, or fall back to v0.1.0 if none exists
LATEST_TAG=$(git describe --tags --abbrev=0 2>/dev/null || echo "v0.1.0")
echo "Latest version: $LATEST_TAG"
# Extract the version components
VERSION=${LATEST_TAG#v}
MAJOR=$(echo $VERSION | cut -d. -f1)
MINOR=$(echo $VERSION | cut -d. -f2)
PATCH=$(echo $VERSION | cut -d. -f3)
# Get the latest commit message
COMMIT_MSG=$(git log -1 --pretty=%B)
# Decide which version level to bump
if echo "$COMMIT_MSG" | grep -q "\[bump-major\]"; then
echo "Major version bump detected in commit message"
MAJOR=$((MAJOR + 1))
MINOR=0
PATCH=0
elif echo "$COMMIT_MSG" | grep -q "\[bump-minor\]"; then
echo "Minor version bump detected in commit message"
MINOR=$((MINOR + 1))
PATCH=0
elif echo "$COMMIT_MSG" | grep -q "\[version="; then
# Custom format: [version=X.Y.Z]
CUSTOM_VERSION=$(echo "$COMMIT_MSG" | grep -o '\[version=[0-9]*\.[0-9]*\.[0-9]*\]' | sed 's/\[version=\(.*\)\]/\1/')
if [ ! -z "$CUSTOM_VERSION" ]; then
echo "Custom version detected: $CUSTOM_VERSION"
MAJOR=$(echo $CUSTOM_VERSION | cut -d. -f1)
MINOR=$(echo $CUSTOM_VERSION | cut -d. -f2)
PATCH=$(echo $CUSTOM_VERSION | cut -d. -f3)
else
# Default: bump the patch version
PATCH=$((PATCH + 1))
fi
else
# Default: bump the patch version
PATCH=$((PATCH + 1))
fi
# New version
NEW_VERSION="v$MAJOR.$MINOR.$PATCH"
echo "New version: $NEW_VERSION"
echo "tag=$NEW_VERSION" >> $GITHUB_OUTPUT
build-binary:
needs: version
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Set up Docker Buildx
- name: Install Rust
uses: actions-rs/toolchain@v1
with:
toolchain: stable
target: x86_64-unknown-linux-musl
override: true
- name: Install cross
run: cargo install cross
- name: Create Cross.toml to configure vendored OpenSSL
run: |
cat > Cross.toml << 'EOF'
[build.env]
passthrough = [
"RUSTFLAGS",
"OPENSSL_STATIC",
"OPENSSL_NO_VENDOR"
]
EOF
- name: Build with cross and vendored OpenSSL
env:
OPENSSL_STATIC: 1
RUSTFLAGS: "-C target-feature=+crt-static"
OPENSSL_NO_VENDOR: 0
run: |
cross build --release --target x86_64-unknown-linux-musl --features vendored-openssl
- name: Prepare the binary
run: |
mkdir -p release
cp target/x86_64-unknown-linux-musl/release/github-ntfy release/github-ntfy
- name: Upload binary as artifact
uses: actions/upload-artifact@v4
with:
name: github-ntfy
path: release/github-ntfy
build-frontend:
needs: version
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Setup PNPM
uses: pnpm/action-setup@v2
with:
version: '10.x'
run_install: false
- name: Build Frontend (Nuxt)
run: |
cd web
pnpm install
pnpm generate
- name: Check the output directory contents
run: |
ls -la web/.output/public || echo "The .output directory does not exist!"
- name: Upload frontend as artifact
uses: actions/upload-artifact@v4
with:
name: nuxt-frontend
path: web/.output/public
docker-build-push:
needs: [version, build-binary, build-frontend]
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Configure Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Log in to Docker Hub
- name: Login Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Build and push Docker image
- name: Download the binary
uses: actions/download-artifact@v4
with:
name: github-ntfy
path: binaries
- name: Download the frontend
uses: actions/download-artifact@v4
with:
name: nuxt-frontend
path: web/.output/public
- name: Prepare files for Docker
run: |
chmod +x binaries/github-ntfy
mkdir -p docker-build
cp binaries/github-ntfy docker-build/
mkdir -p docker-build/web-output/public
cp -r web/.output/public/* docker-build/web-output/public/
cp nginx.conf docker-build/
cp entrypoint.sh docker-build/
cp Dockerfile docker-build/
chmod +x docker-build/entrypoint.sh
- name: Build and push the Docker image
uses: docker/build-push-action@v6
with:
context: .
context: docker-build
push: true
tags: ${{ secrets.DOCKER_USERNAME }}/github-ntfy:latest
tags: |
breizhhardware/github-ntfy:latest
breizhhardware/github-ntfy:${{ needs.version.outputs.version }}
file: docker-build/Dockerfile
create-release:
needs: [version, build-binary, build-frontend]
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Download the binary
uses: actions/download-artifact@v4
with:
name: github-ntfy
path: binaries
- name: Download the frontend
uses: actions/download-artifact@v4
with:
name: nuxt-frontend
path: web/.output/public
- name: Prepare files for the release
run: |
mkdir -p release-artifacts
cp binaries/github-ntfy release-artifacts/
tar -czf release-artifacts/frontend.tar.gz -C web/.output/public .
- name: Create a GitHub release
uses: softprops/action-gh-release@v1
with:
tag_name: ${{ needs.version.outputs.version }}
name: Release ${{ needs.version.outputs.version }}
files: |
release-artifacts/github-ntfy
release-artifacts/frontend.tar.gz
draft: false
prerelease: false
generate_release_notes: true
env:
GITHUB_TOKEN: ${{ secrets.TOKEN }}

(deleted workflow file)

@@ -1,73 +0,0 @@
name: Docker Build and Release
on:
push:
branches:
- main
jobs:
build-and-push-on-docker-hub:
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Log in to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Build and push Docker image
uses: docker/build-push-action@v6
with:
context: .
push: true
tags: ${{ secrets.DOCKER_USERNAME }}/github-ntfy:latest
release-on-github:
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Get the latest tag
id: get_latest_tag
run: echo "latest_tag=$(git describe --tags `git rev-list --tags --max-count=1`)" >> $GITHUB_ENV
- name: Increment version
id: increment_version
run: |
latest_tag=${{ env.latest_tag }}
if [ -z "$latest_tag" ]; then
new_version="v1.5.2"
else
IFS='.' read -r -a version_parts <<< "${latest_tag#v}"
new_version="v${version_parts[0]}.$((version_parts[1] + 1)).0"
fi
echo "new_version=$new_version" >> $GITHUB_ENV
- name: Read changelog
id: read_changelog
run: echo "changelog=$(base64 -w 0 CHANGELOG.md)" >> $GITHUB_ENV
- name: Decode changelog
id: decode_changelog
run: echo "${{ env.changelog }}" | base64 -d > decoded_changelog.txt
- name: Create Release
id: create_release
uses: actions/create-release@v1
env:
GITHUB_TOKEN: ${{ secrets.TOKEN }}
with:
tag_name: ${{ env.new_version }}
release_name: Release ${{ env.new_version }}
body: ${{ steps.decode_changelog.outputs.changelog }}
draft: false
prerelease: false

(deleted workflow file)

@@ -1,38 +0,0 @@
name: Docker Build and Release for arm64
on:
push:
branches:
- main
jobs:
build-and-push-on-docker-hub:
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
with:
platforms: arm64
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
install: true
- name: Log in to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Build and push Docker image
uses: docker/build-push-action@v6
with:
context: .
push: true
platforms: linux/arm64
tags: ${{ secrets.DOCKER_USERNAME }}/github-ntfy:arm64

(deleted workflow file)

@@ -1,38 +0,0 @@
name: Docker Build and Release for armv7
on:
push:
branches:
- main
jobs:
build-and-push-on-docker-hub:
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
with:
platforms: arm/v7
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
install: true
- name: Log in to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Build and push Docker image
uses: docker/build-push-action@v6
with:
context: .
push: true
platforms: linux/arm/v7
tags: ${{ secrets.DOCKER_USERNAME }}/github-ntfy:armv7

.github/workflows/dependabot-build.yml (new file, 92 lines)

@@ -0,0 +1,92 @@
name: Dependabot Build
on:
pull_request:
branches: [ 'main', 'dev' ]
paths:
- '**/Cargo.toml'
- '**/Cargo.lock'
- 'web/package.json'
- 'web/pnpm-lock.yaml'
jobs:
build-binary:
if: ${{ startsWith(github.ref, 'refs/heads/dependabot/') || github.actor == 'dependabot[bot]' }}
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Install Rust
uses: actions-rs/toolchain@v1
with:
toolchain: stable
target: x86_64-unknown-linux-musl
override: true
- name: Install cross
run: cargo install cross
- name: Create Cross.toml to configure vendored OpenSSL
run: |
cat > Cross.toml << 'EOF'
[build.env]
passthrough = [
"RUSTFLAGS",
"OPENSSL_STATIC",
"OPENSSL_NO_VENDOR"
]
EOF
- name: Build with cross and vendored OpenSSL
env:
OPENSSL_STATIC: 1
RUSTFLAGS: "-C target-feature=+crt-static"
OPENSSL_NO_VENDOR: 0
run: |
cross build --release --target x86_64-unknown-linux-musl --features vendored-openssl
- name: Prepare the binary
run: |
mkdir -p release
cp target/x86_64-unknown-linux-musl/release/github-ntfy release/github-ntfy
- name: Upload binary as artifact
uses: actions/upload-artifact@v4
with:
name: github-ntfy-dependabot
path: release/github-ntfy
build-frontend:
if: ${{ github.actor == 'dependabot[bot]' || startsWith(github.ref, 'refs/heads/dependabot/') }}
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Setup PNPM
uses: pnpm/action-setup@v2
with:
version: '10.x'
run_install: false
- name: Build Frontend (Nuxt)
run: |
cd web
pnpm install
pnpm generate
- name: Check the output directory contents
run: |
ls -la web/.output/public || echo "The .output directory does not exist!"
- name: Upload frontend as artifact
uses: actions/upload-artifact@v4
with:
name: nuxt-frontend-dependabot
path: web/.output/public

.gitignore (modified, 10 lines)

@@ -405,4 +405,12 @@ docker-compose.yml
github-ntfy/
github-ntfy/*
*.db
*.db
# Rust
target
target/*
binaries
binaries/*

Cargo.lock (generated, 2443 lines; diff suppressed because it is too large)

Cargo.toml (new file, 26 lines)

@@ -0,0 +1,26 @@
[package]
name = "github-ntfy"
version = "2.0.0"
edition = "2021"
[[bin]]
name = "github-ntfy"
path = "src/main.rs"
[features]
vendored-openssl = ["openssl/vendored"]
[dependencies]
tokio = { version = "1", features = ["full"] }
reqwest = { version = "0.12", features = ["json", "blocking"] }
rusqlite = { version = "0.37", features = ["bundled"] }
serde = { version = "1", features = ["derive"] }
serde_json = "1"
log = "0.4"
env_logger = "0.11"
dotenv = "0.15"
chrono = { version = "0.4", features = ["serde"] }
warp = "0.3"
openssl = { version = "0.10", features = ["vendored"] }
rand = "0.9"
bcrypt = "0.17"

Dockerfile (modified)

@@ -1,39 +1,32 @@
FROM python:3.11.8-alpine3.19
FROM alpine:3.22
LABEL maintainer="BreizhHardware"
LABEL version_number="1.4"
# Copy the binary
COPY github-ntfy /usr/local/bin/github-ntfy
ADD ntfy.py /
ADD ntfy_api.py /
ADD requirements.txt /
ADD entrypoint.sh /
ADD send_ntfy.py /
ADD send_gotify.py /
ADD send_discord.py /
ADD send_slack.py /
ADD index.html /var/www/html/index.html
ADD script.js /var/www/html/script.js
RUN apk add --no-cache sqlite-dev sqlite-libs musl-dev nginx gcc
RUN pip install -r requirements.txt
RUN chmod 700 /entrypoint.sh
# Install dependencies
RUN apk add --no-cache sqlite-libs openssl nginx nodejs npm && \
chmod +x /usr/local/bin/github-ntfy
# Define environment variables for username and password
ENV USERNAME="" \
PASSWORD="" \
NTFY_URL="" \
GHNTFY_TIMEOUT="3600" \
GHNTFY_TOKEN="" \
DOCKER_USERNAME="" \
DOCKER_PASSWORD="" \
GOTIFY_URL="" \
GOTIFY_TOKEN="" \
DISCORD_WEBHOOK_URL="" \
SLACK_WEBHOOK_URL="" \
FLASK_ENV=production
# Expose port 5000 for the API and port 80 for the web server
EXPOSE 5000 80
WORKDIR /app
# Copy the web files into the directory expected by nginx
COPY web-output/public /var/www/html/
COPY nginx.conf /etc/nginx/nginx.conf
ENTRYPOINT ["/entrypoint.sh"]
# Copy the entrypoint script
COPY entrypoint.sh /app/entrypoint.sh
RUN chmod +x /app/entrypoint.sh
# Create the data directory and set permissions
RUN mkdir -p /github-ntfy && chmod 755 /github-ntfy
# Environment variables (optional)
ENV DB_PATH=/github-ntfy
ENV RUST_LOG=info
# Volumes for data persistence
VOLUME ["/github-ntfy"]
EXPOSE 5000 80
ENTRYPOINT ["/app/entrypoint.sh"]

README.md (modified, 141 lines)

@@ -1,6 +1,6 @@
<h1 align="center">Welcome to ntfy_alerts 👋</h1>
<p>
<img alt="Version" src="https://img.shields.io/badge/version-1.5-blue.svg?cacheSeconds=2592000" />
<img alt="Version" src="https://img.shields.io/badge/version-2.1-blue.svg?cacheSeconds=2592000" />
<a href="#" target="_blank">
<img alt="License: GPL--3" src="https://img.shields.io/badge/License-GPL--3-yellow.svg" />
</a>
@@ -9,115 +9,72 @@
</a>
</p>
> This project allows you to receive notifications about new GitHub or Docker Hub releases on ntfy, gotify, and Discord.
> This project allows you to receive notifications about new GitHub or Docker Hub releases on ntfy, gotify, Discord and Slack. Implemented in Rust for better performance.
## Installation
To install the dependencies, run:
```sh
pip install -r requirements.txt
```
### Docker (recommended)
## Usage
Use our Docker image, which automatically supports amd64, arm64 and armv7:
If you want to use the Docker image, you can use the following docker-compose file for x86_64:
````yaml
```yaml
services:
github-ntfy:
image: breizhhardware/github-ntfy:latest
container_name: github-ntfy
environment:
- USERNAME=username # Required
- PASSWORD=password # Required
- NTFY_URL=ntfy_url # Required if ntfy is used
- GHNTFY_TIMEOUT=timeout # Default is 3600 (1 hour)
- GHNTFY_TOKEN= # Default is empty (Github token)
- DOCKER_USERNAME= # Default is empty (Docker Hub username)
- DOCKER_PASSWORD= # Default is empty (Docker Hub password)
- GOTIFY_URL=gotify_url # Required if gotify is used
- GOTIFY_TOKEN= # Required if gotify is used
- DISCORD_WEBHOOK_URL= # Required if discord is used
- SLACK_WEBHOOK_URL= # Required if Slack is used
volumes:
- /path/to/github-ntfy:/github-ntfy/
- /path/to/data:/data
ports:
- 80:80
restart: unless-stopped
````
For arm64 this docker compose file is ok:
````yaml
services:
github-ntfy:
image: breizhhardware/github-ntfy:arm64
container_name: github-ntfy
environment:
- USERNAME=username # Required
- PASSWORD=password # Required
- NTFY_URL=ntfy_url # Required if ntfy is used
- GHNTFY_TIMEOUT=timeout # Default is 3600 (1 hour)
- GHNTFY_TOKEN= # Default is empty (Github token)
- DOCKER_USERNAME= # Default is empty (Docker Hub username)
- DOCKER_PASSWORD= # Default is empty (Docker Hub password)
- GOTIFY_URL=gotify_url # Required if gotify is used
- GOTIFY_TOKEN= # Required if gotify is used
- DISCORD_WEBHOOK_URL= # Required if discord is used
- SLACK_WEBHOOK_URL= # Required if Slack is used
volumes:
- /path/to/github-ntfy:/github-ntfy/
ports:
- 80:80
restart: unless-stopped
````
For armV7 this docker compose is ok:
````yaml
services:
github-ntfy:
image: breizhhardware/github-ntfy:armv7
container_name: github-ntfy
environment:
- USERNAME=username # Required
- PASSWORD=password # Required
- NTFY_URL=ntfy_url # Required if ntfy is used
- GHNTFY_TIMEOUT=timeout # Default is 3600 (1 hour)
- GHNTFY_TOKEN= # Default is empty (Github token)
- DOCKER_USERNAME= # Default is empty (Docker Hub username)
- DOCKER_PASSWORD= # Default is empty (Docker Hub password)
- GOTIFY_URL=gotify_url # Required if gotify is used
- GOTIFY_TOKEN= # Required if gotify is used
- DISCORD_WEBHOOK_URL= # Required if discord is used
- SLACK_WEBHOOK_URL= # Required if Slack is used
volumes:
- /path/to/github-ntfy:/github-ntfy/
ports:
- 80:80
restart: unless-stopped
````
GHNTFY_TOKEN is a github token, it need to have repo, read:org and read:user
```
### Manual Installation
Install Rust if needed
```BASH
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
```
Clone the repository
```BASH
git clone https://github.com/BreizhHardware/ntfy_alerts.git
cd ntfy_alerts
```
Compile
```BASH
cargo build --release
```
Run
```BASH
./target/release/github-ntfy
```
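Configuration can be supplied through environment variables, as with the Docker image; a minimal sketch of a manual run, assuming the binary honours the same variables documented for the compose examples (placeholder values shown):
```sh
# Placeholder values; NTFY_URL and GHNTFY_TOKEN are the same variables
# used in the docker-compose example above
export NTFY_URL=https://ntfy.example.com/github-ntfy
export GHNTFY_TOKEN=<your-github-token>
export RUST_LOG=info
./target/release/github-ntfy
```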
## Version Notes
- v2.0: Complete rewrite in Rust for better performance and reduced resource consumption
- [v1.7.1](https://github.com/BreizhHardware/ntfy_alerts/tree/v1.7.2): Stable Python version
## Configuration
The GitHub token (GHNTFY_TOKEN) needs to have the following permissions: repo, read:org and read:user.
## TODO
- [ ] Add support for multi-architecture Docker images
- [x] Rework web interface
- [ ] Add support for more notification services (Telegram, Matrix, etc.)
- [x] Add web onboarding instead of using environment variables
## Author
👤 BreizhHardware
👤 **BreizhHardware**
* Website: https://mrqt.fr?ref=github
* Twitter: [@BreizhHardware](https://twitter.com/BreizhHardware)
* Github: [@BreizhHardware](https://github.com/BreizhHardware)
* LinkedIn: [@félix-marquet-5071bb167](https://linkedin.com/in/félix-marquet-5071bb167)
- Website: [https://mrqt.fr](https://mrqt.fr?ref=github)
- Twitter: [@BreizhHardware](https://twitter.com/BreizhHardware)
- Github: [@BreizhHardware](https://github.com/BreizhHardware)
- LinkedIn: [@félix-marquet-5071bb167](https://linkedin.com/in/félix-marquet-5071bb167)
## Contribution
If you want to contribute, feel free to open a pull request, but first read the [contribution guide](CONTRIBUTION.md)!
## TODO:
- [x] Dockerize the ntfy.py
- [x] Add the watched repos list as a parameter
- [x] Add the application version as a database
- [x] Add the watched repos list as a web interface
- [x] Add Docker Hub compatibility
- [ ] Rework of the web interface
- [x] Compatibility with Gotify
- [x] Compatibility with Discord Webhook
- [x] Compatibility and distribution for arm64 and armv7
## Contributing
Contributions are what make the open-source community such an amazing place to learn, inspire, and create. Any contributions you make are **greatly appreciated**. But first, please read the [CONTRIBUTION.md](CONTRIBUTION.md) file.
## Show your support
Give a ⭐️ if this project helped you!

entrypoint.sh (modified)

@@ -1,10 +1,25 @@
#!/bin/sh
# Generate the auth.txt file contents from environment variables
echo -n "$USERNAME:$PASSWORD" | base64 > /auth.txt
# Check if USERNAME and PASSWORD environment variables are defined
if [ -n "$USERNAME" ] && [ -n "$PASSWORD" ]; then
# Generate auth.txt file content from environment variables
echo -n "$USERNAME:$PASSWORD" > /auth.txt
echo "Authentication file generated from environment variables"
else
echo "USERNAME and/or PASSWORD variables not defined"
echo "Authentication will be managed by the onboarding system via the web interface"
fi
# Start nginx in the background
# Set database directory permissions
if [ -d "/github-ntfy" ]; then
chmod -R 755 /github-ntfy
echo "Permissions applied to data directory"
fi
# Start nginx in the background
echo "Starting Nginx..."
nginx -g 'daemon off;' &
# Run the Python script
exec python ./ntfy.py
# Start the main application
echo "Starting application..."
exec /usr/local/bin/github-ntfy
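
The updated entrypoint only writes /auth.txt when both USERNAME and PASSWORD are set; otherwise the container still starts and credentials are created through the web onboarding. A hedged `docker run` equivalent of the compose example from the README, with placeholder values:

```sh
# Pre-seeded credentials: the entrypoint generates /auth.txt at startup
docker run -d --name github-ntfy \
  -e USERNAME=admin -e PASSWORD=changeme \
  -v /path/to/github-ntfy:/github-ntfy \
  -p 80:80 \
  breizhhardware/github-ntfy:latest

# Omit USERNAME/PASSWORD to let the web onboarding manage authentication instead
```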

index.html (deleted)

@@ -1,69 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Github-Ntfy Add a Repo</title>
<script src="https://cdn.tailwindcss.com"></script>
<script src="./script.js" defer></script>
</head>
<body class="bg-[#1b2124] text-gray-200">
<header class="text-center py-8 bg-[#23453d] shadow-lg">
<h1 class="text-5xl font-bold tracking-wide text-white">Github-Ntfy</h1>
</header>
<main class="flex flex-wrap justify-center gap-8 py-12">
<!-- Github Repo Section -->
<section class="bg-[#23453d] rounded-lg shadow-lg p-6 w-full max-w-lg">
<h2 class="text-2xl font-semibold mb-4">Add a Github Repo</h2>
<form id="addRepoForm" class="space-y-6">
<div>
<label for="repo" class="block text-sm font-medium">Name of the Github Repo</label>
<div class="mt-2 flex items-center border rounded-md bg-gray-700">
<span class="px-3 text-gray-400">github.com/</span>
<input type="text" name="repo" id="repo" autocomplete="repo" class="flex-1 py-2 px-3 bg-transparent focus:outline-none" placeholder="BreizhHardware/ntfy_alerts">
</div>
</div>
<div class="flex justify-end gap-4">
<button type="button" class="px-4 py-2 text-gray-400 hover:text-white">Cancel</button>
<button type="submit" class="px-4 py-2 bg-green-700 hover:bg-green-600 text-white font-semibold rounded-md">Save</button>
</div>
</form>
<div class="mt-8">
<h3 class="text-lg font-semibold mb-2">Watched Github Repositories</h3>
<ul id="watchedReposList" class="space-y-2">
<!-- Dynamically populated with JavaScript -->
</ul>
</div>
</section>
<!-- Docker Repo Section -->
<section class="bg-[#23453d] rounded-lg shadow-lg p-6 w-full max-w-lg">
<h2 class="text-2xl font-semibold mb-4">Add a Docker Repo</h2>
<form id="addDockerRepoForm" class="space-y-6">
<div>
<label for="dockerRepo" class="block text-sm font-medium">Name of the Docker Repo</label>
<div class="mt-2 flex items-center border rounded-md bg-gray-700">
<span class="px-3 text-gray-400">hub.docker.com/r/</span>
<input type="text" name="dockerRepo" id="dockerRepo" autocomplete="dockerRepo" class="flex-1 py-2 px-3 bg-transparent focus:outline-none" placeholder="breizhhardware/github-ntfy">
</div>
</div>
<div class="flex justify-end gap-4">
<button type="button" class="px-4 py-2 text-gray-400 hover:text-white">Cancel</button>
<button type="submit" class="px-4 py-2 bg-green-700 hover:bg-green-600 text-white font-semibold rounded-md">Save</button>
</div>
</form>
<div class="mt-8">
<h3 class="text-lg font-semibold mb-2">Watched Docker Repositories</h3>
<ul id="watchedDockerReposList" class="space-y-2">
<!-- Dynamically populated with JavaScript -->
</ul>
</div>
</section>
</main>
<footer class="text-center py-6 bg-[#23453d]">
<p class="text-sm">I know this web interface is simple, but I'm improving!</p>
</footer>
</body>
</html>

nginx.conf (modified)

@@ -6,55 +6,52 @@ http {
include mime.types;
default_type application/octet-stream;
# Added to handle static files correctly
sendfile on;
keepalive_timeout 65;
# Added to preserve the port in redirects
port_in_redirect off;
absolute_redirect off;
server {
listen 80;
server_name _;
# Configuration for serving the static Nuxt frontend
location / {
root /var/www/html;
index index.html;
root /var/www/html;
index index.html;
try_files $uri $uri/ /index.html;
# Enable headers that make debugging easier
add_header X-Content-Type-Options "nosniff";
add_header X-Frame-Options "DENY";
add_header X-Served-By "nginx";
}
location /app_repo {
# Consolidated configuration for all API routes
location ~* ^/(app_github_repo|app_docker_repo|watched_repos|watched_docker_repos|delete_repo|delete_docker_repo|latest_updates|auth|settings|is_configured) {
proxy_pass http://127.0.0.1:5000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
location /watched_repos {
proxy_pass http://127.0.0.1:5000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
location /delete_repo {
proxy_pass http://127.0.0.1:5000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
location /app_docker_repo {
proxy_pass http://127.0.0.1:5000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
location /watched_docker_repos {
proxy_pass http://127.0.0.1:5000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
location /delete_docker_repo {
proxy_pass http://127.0.0.1:5000;
proxy_set_header Host $host;
proxy_set_header Host $host:$server_port;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_set_header X-Forwarded-Port $server_port;
# Important configuration for WebSockets, if used
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "upgrade";
# Increase timeouts for long requests
proxy_connect_timeout 60s;
proxy_send_timeout 60s;
proxy_read_timeout 60s;
}
# Add logs for debugging
error_log /var/log/nginx/error.log warn;
access_log /var/log/nginx/access.log;
}
}

ntfy.py (deleted, 255 lines)

@@ -1,255 +0,0 @@
import requests
import time
import os
import logging
import sqlite3
import subprocess
import json
import threading
from send_ntfy import (
github_send_to_ntfy,
docker_send_to_ntfy,
)
from send_gotify import (
github_send_to_gotify,
docker_send_to_gotify,
)
from send_discord import (
github_send_to_discord,
docker_send_to_discord,
)
from send_slack import (
github_send_to_slack,
docker_send_to_slack,
)
# Configuring the logger
logging.basicConfig(
level=logging.INFO,
format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
)
logger = logging.getLogger(__name__)
github_token = os.environ.get("GHNTFY_TOKEN")
github_headers = {}
if github_token:
github_headers["Authorization"] = f"token {github_token}"
docker_username = os.environ.get("DOCKER_USERNAME")
docker_password = os.environ.get("DOCKER_PASSWORD")
discord_webhook_url = os.environ.get("DISCORD_WEBHOOK_URL")
def create_dockerhub_token(username, password):
url = "https://hub.docker.com/v2/users/login"
headers = {"Content-Type": "application/json"}
data = json.dumps({"username": username, "password": password})
response = requests.post(url, headers=headers, data=data)
if response.status_code == 200:
token = response.json().get("token")
if token:
return token
else:
logger.error("Failed to get Docker Hub token.")
else:
logger.error(f"Failed to get Docker Hub token. Status code: {response.status_code}")
return None
docker_token = create_dockerhub_token(docker_username, docker_password)
docker_header = {}
if docker_token:
docker_header["Authorization"] = f"Bearer {docker_token}"
# Connecting to the database to store previous versions
conn = sqlite3.connect(
"/github-ntfy/ghntfy_versions.db",
check_same_thread=False,
)
cursor = conn.cursor()
# Creating the table if it does not exist
cursor.execute(
"""CREATE TABLE IF NOT EXISTS versions
(repo TEXT PRIMARY KEY, version TEXT, changelog TEXT)"""
)
conn.commit()
cursor.execute(
"""CREATE TABLE IF NOT EXISTS docker_versions
(repo TEXT PRIMARY KEY, digest TEXT)"""
)
conn.commit()
logger.info("Starting version monitoring...")
conn2 = sqlite3.connect("/github-ntfy/watched_repos.db", check_same_thread=False)
cursor2 = conn2.cursor()
cursor2.execute(
"""CREATE TABLE IF NOT EXISTS watched_repos
(id INTEGER PRIMARY KEY, repo TEXT)"""
)
conn2.commit()
cursor2.execute(
"""CREATE TABLE IF NOT EXISTS docker_watched_repos
(id INTEGER PRIMARY KEY, repo TEXT)"""
)
conn2.commit()
def get_watched_repos():
cursor2.execute("SELECT * FROM watched_repos")
watched_repos_rows = cursor2.fetchall()
watched_repos = []
for repo in watched_repos_rows:
watched_repos.append(repo[1])
return watched_repos
def get_docker_watched_repos():
cursor2.execute("SELECT * FROM docker_watched_repos")
watched_repos_rows = cursor2.fetchall()
watched_repos = []
for repo in watched_repos_rows:
watched_repos.append(repo[1])
return watched_repos
def start_api():
subprocess.Popen(["python", "ntfy_api.py"])
def get_latest_releases(watched_repos):
releases = []
for repo in watched_repos:
url = f"https://api.github.com/repos/{repo}/releases/latest"
response = requests.get(url, headers=github_headers)
if response.status_code == 200:
release_info = response.json()
changelog = get_changelog(repo)
release_date = release_info.get("published_at", "Release date not available")
releases.append(
{
"repo": repo,
"name": release_info["name"],
"tag_name": release_info["tag_name"],
"html_url": release_info["html_url"],
"changelog": changelog,
"published_at": release_date,
}
)
else:
logger.error(f"Failed to fetch release info for {repo}")
return releases
def get_latest_docker_releases(watched_repos):
releases = []
for repo in watched_repos:
url = f"https://hub.docker.com/v2/repositories/{repo}/tags/latest"
response = requests.get(url, headers=docker_header)
if response.status_code == 200:
release_info = response.json()
release_date = release_info["last_upated"]
digest = release_date["digest"]
releases.append(
{
"repo": repo,
"digest": digest,
"html_url": "https://hub.docker.com/r/" + repo,
"published_at": release_date,
}
)
else:
logger.error(f"Failed to fetch Docker Hub info for {repo}")
return releases
def get_changelog(repo):
url = f"https://api.github.com/repos/{repo}/releases"
response = requests.get(url, headers=github_headers)
if response.status_code == 200:
releases = response.json()
if releases:
latest_release_list = releases[0]
if "body" in latest_release_list:
return latest_release_list["body"]
return "Changelog not available"
def notify_all_services(github_latest_release, docker_latest_release, auth, ntfy_url, gotify_url, gotify_token, discord_webhook_url, slack_webhook_url):
threads = []
if ntfy_url:
if github_latest_release:
threads.append(threading.Thread(target=github_send_to_ntfy, args=(github_latest_release, auth, ntfy_url)))
if docker_latest_release:
threads.append(threading.Thread(target=docker_send_to_ntfy, args=(docker_latest_release, auth, ntfy_url)))
if gotify_url and gotify_token:
if github_latest_release:
threads.append(threading.Thread(target=github_send_to_gotify, args=(github_latest_release, gotify_token, gotify_url)))
if docker_latest_release:
threads.append(threading.Thread(target=docker_send_to_gotify, args=(docker_latest_release, gotify_token, gotify_url)))
if discord_webhook_url:
if github_latest_release:
threads.append(threading.Thread(target=github_send_to_discord, args=(github_latest_release, discord_webhook_url)))
if docker_latest_release:
threads.append(threading.Thread(target=docker_send_to_discord, args=(docker_latest_release, discord_webhook_url)))
if slack_webhook_url:
if github_latest_release:
threads.append(threading.Thread(target=github_send_to_slack, args=(github_latest_release, slack_webhook_url)))
if docker_latest_release:
threads.append(threading.Thread(target=docker_send_to_slack, args=(docker_latest_release, slack_webhook_url)))
for thread in threads:
thread.start()
for thread in threads:
thread.join()
if __name__ == "__main__":
start_api()
with open("/auth.txt", "r") as f:
auth = f.read().strip()
ntfy_url = os.environ.get("NTFY_URL")
gotify_url = os.environ.get("GOTIFY_URL")
gotify_token = os.environ.get("GOTIFY_TOKEN")
discord_webhook_url = os.environ.get("DISCORD_WEBHOOK_URL")
timeout = float(os.environ.get("GHNTFY_TIMEOUT", "3600"))
slack_webhook_url = os.environ.get("SLACK_WEBHOOK_URL")
if auth and (ntfy_url or gotify_url or discord_webhook_url or slack_webhook_url):
while True:
github_watched_repos_list = get_watched_repos()
github_latest_release = get_latest_releases(github_watched_repos_list)
docker_watched_repos_list = get_docker_watched_repos()
docker_latest_release = get_latest_docker_releases(docker_watched_repos_list)
notify_all_services(github_latest_release, docker_latest_release, auth, ntfy_url, gotify_url, gotify_token, discord_webhook_url, slack_webhook_url)
time.sleep(timeout)
else:
logger.error("Usage: python ntfy.py")
logger.error(
"auth: can be generataed by the folowing command: echo -n 'username:password' | base64 and need to be "
"stored in a file named auth.txt"
)
logger.error("NTFY_URL: the url of the ntfy server need to be stored in an environment variable named NTFY_URL")
logger.error(
"GOTIFY_URL: the url of the gotify server need to be stored in an environment variable named GOTIFY_URL"
)
logger.error(
"GOTIFY_TOKEN: the token of the gotify server need to be stored in an environment variable named GOTIFY_TOKEN"
)
logger.error("DISCORD_WEBHOOK_URL: the webhook URL for Discord notifications need to be stored in an environment variable named DISCORD_WEBHOOK_URL")
logger.error("GHNTFY_TIMEOUT: the time interval between each check")

View File

@@ -1,207 +0,0 @@
from flask import Flask, request, jsonify
from flask_cors import CORS
import sqlite3
app = Flask(__name__)
CORS(app)
app.logger.setLevel("WARNING")
def get_db_connection():
conn = sqlite3.connect("/github-ntfy/watched_repos.db")
conn.row_factory = sqlite3.Row
return conn
def close_db_connection(conn):
conn.close()
@app.route("/app_repo", methods=["POST"])
def app_repo():
data = request.json
repo = data.get("repo")
# Check that the 'repo' field is present in the JSON payload
if not repo:
return (
jsonify({"error": "The repo field is required."}),
400,
)
# Open a database connection
conn = get_db_connection()
cursor = conn.cursor()
try:
# Check whether the repository is already in the database
cursor.execute(
"SELECT * FROM watched_repos WHERE repo=?",
(repo,),
)
existing_repo = cursor.fetchone()
if existing_repo:
return (
jsonify({"error": f"The GitHub repo {repo} is already in the database."}),
409,
)
# Add the repository to the database
cursor.execute(
"INSERT INTO watched_repos (repo) VALUES (?)",
(repo,),
)
conn.commit()
return jsonify({"message": f"The GitHub repo {repo} as been added to the watched repos."})
finally:
# Close the database connection
close_db_connection(conn)
@app.route("/app_docker_repo", methods=["POST"])
def app_docker_repo():
data = request.json
repo = data.get("repo")
# Check that the 'repo' field is present in the JSON payload
if not repo:
return (
jsonify({"error": "The repo field is required."}),
400,
)
# Open a database connection
conn = get_db_connection()
cursor = conn.cursor()
try:
# Check whether the repository is already in the database
cursor.execute(
"SELECT * FROM docker_watched_repos WHERE repo=?",
(repo,),
)
existing_repo = cursor.fetchone()
if existing_repo:
return (
jsonify({"error": f"The Docker repo {repo} is already in the database."}),
409,
)
# Add the repository to the database
cursor.execute(
"INSERT INTO docker_watched_repos (repo) VALUES (?)",
(repo,),
)
conn.commit()
return jsonify({"message": f"The Docker repo {repo} as been added to the watched repos."})
finally:
# Close the database connection
close_db_connection(conn)
@app.route("/watched_repos", methods=["GET"])
def get_watched_repos():
db = get_db_connection()
cursor = db.cursor()
cursor.execute("SELECT repo FROM watched_repos")
watched_repos = [repo[0] for repo in cursor.fetchall()]
cursor.close()
db.close()
return jsonify(watched_repos)
@app.route("/watched_docker_repos", methods=["GET"])
def get_watched_docker_repos():
db = get_db_connection()
cursor = db.cursor()
cursor.execute("SELECT repo FROM docker_watched_repos")
watched_repos = [repo[0] for repo in cursor.fetchall()]
cursor.close()
db.close()
return jsonify(watched_repos)
@app.route("/delete_repo", methods=["POST"])
def delete_repo():
data = request.json
repo = data.get("repo")
# Check that the 'repo' field is present in the JSON payload
if not repo:
return (
jsonify({"error": "The repo field is required."}),
400,
)
# Open a database connection
conn = get_db_connection()
cursor = conn.cursor()
try:
# Check whether the repository is in the database
cursor.execute(
"SELECT * FROM watched_repos WHERE repo=?",
(repo,),
)
existing_repo = cursor.fetchone()
if not existing_repo:
return (
jsonify({"error": f"The GitHub repo {repo} is not in the database."}),
404,
)
# Delete the repository from the database
cursor.execute(
"DELETE FROM watched_repos WHERE repo=?",
(repo,),
)
conn.commit()
return jsonify({"message": f"The GitHub repo {repo} as been deleted from the watched repos."})
finally:
# Close the database connection
close_db_connection(conn)
@app.route("/delete_docker_repo", methods=["POST"])
def delete_docker_repo():
data = request.json
repo = data.get("repo")
# Check that the 'repo' field is present in the JSON payload
if not repo:
return (
jsonify({"error": "The repo field is required."}),
400,
)
# Open a database connection
conn = get_db_connection()
cursor = conn.cursor()
try:
# Check whether the repository is in the database
cursor.execute(
"SELECT * FROM docker_watched_repos WHERE repo=?",
(repo,),
)
existing_repo = cursor.fetchone()
if not existing_repo:
return (
jsonify({"error": f"The Docker repo {repo} is not in the database."}),
404,
)
# Delete the repository from the database
cursor.execute(
"DELETE FROM docker_watched_repos WHERE repo=?",
(repo,),
)
conn.commit()
return jsonify({"message": f"The Docker repo {repo} as been deleted from the watched repos."})
finally:
# Close the database connection
close_db_connection(conn)
if __name__ == "__main__":
app.run(debug=False)

View File

@@ -1,2 +0,0 @@
[tool.black]
line-length = 120

View File

@@ -1,4 +0,0 @@
requests==2.31.0
pysqlite3==0.5.2
flask==3.0.2
flask-cors==4.0.0

158
script.js
View File

@@ -1,158 +0,0 @@
document.getElementById('addRepoForm').addEventListener('submit', function(event) {
event.preventDefault();
let repoName = document.getElementById('repo').value;
fetch('/app_repo', {
method: 'POST',
headers: {
'Access-Control-Allow-Origin': '*',
'Content-Type': 'application/json'
},
body: JSON.stringify({repo: repoName})
})
.then(response => {
if (response.ok) {
// If the request succeeded, refresh the list of watched repos
refreshWatchedRepos();
} else {
throw new Error('Error while adding the repository');
}
})
.catch(error => {
console.error('Error:', error);
});
});
document.getElementById('addDockerRepoForm').addEventListener('submit', function(event) {
event.preventDefault();
let repoName = document.getElementById('dockerRepo').value;
fetch('/app_docker_repo', {
method: 'POST',
headers: {
'Access-Control-Allow-Origin': '*',
'Content-Type': 'application/json'
},
body: JSON.stringify({repo: repoName})
})
.then(response => {
if (response.ok) {
// If the request succeeded, refresh the list of watched repos
refreshWatchedRepos();
} else {
throw new Error('Error while adding the repository');
}
})
.catch(error => {
console.error('Error:', error);
});
});
function refreshWatchedRepos() {
fetch('/watched_repos')
.then(response => response.json())
.then(data => {
const watchedReposList = document.getElementById('watchedReposList');
// Clear the current list
watchedReposList.innerHTML = '';
// Add each watched repo to the list
data.forEach(repo => {
const listItem = document.createElement('li');
const repoName = document.createElement('span');
repoName.textContent = repo;
repoName.className = 'repo-name';
listItem.appendChild(repoName);
const deleteButton = document.createElement('button');
deleteButton.textContent = ' X';
deleteButton.className = 'delete-btn text-red-500 ml-2';
deleteButton.addEventListener('click', () => {
// Remove the repo from the watched repos via the API
removeRepoFromWatchedRepos(repo);
// Remove the repo from the DOM
listItem.remove();
});
listItem.appendChild(deleteButton);
watchedReposList.appendChild(listItem);
});
})
.catch(error => {
console.error('Error:', error);
});
fetch('/watched_docker_repos')
.then(response => response.json())
.then(data => {
const watchedDockerReposList = document.getElementById('watchedDockerReposList');
// Clear the current list
watchedDockerReposList.innerHTML = '';
// Add each watched repo to the list
data.forEach(repo => {
const listItem = document.createElement('li');
const repoName = document.createElement('span');
repoName.textContent = repo;
repoName.className = 'repo-name';
listItem.appendChild(repoName);
const deleteButton = document.createElement('button');
deleteButton.textContent = ' X';
deleteButton.className = 'delete-btn text-red-500 ml-2';
deleteButton.addEventListener('click', () => {
// Remove the repo from the watched repos via the API
removeDockerRepoFromWatchedRepos(repo);
// Remove the repo from the DOM
listItem.remove();
});
listItem.appendChild(deleteButton);
watchedDockerReposList.appendChild(listItem);
});
})
.catch(error => {
console.error('Error:', error);
});
}
function removeRepoFromWatchedRepos(repo) {
fetch('/delete_repo', {
method: 'POST',
headers: {
'Access-Control-Allow-Origin': '*',
'Content-Type': 'application/json'
},
body: JSON.stringify({repo: repo})
})
.then(response => {
if (!response.ok) {
throw new Error('Error while deleting the repository');
}
})
.catch(error => {
console.error('Error:', error);
});
}
function removeDockerRepoFromWatchedRepos(repo) {
fetch('/delete_docker_repo', {
method: 'POST',
headers: {
'Access-Control-Allow-Origin': '*',
'Content-Type': 'application/json'
},
body: JSON.stringify({repo: repo})
})
.then(response => {
if (!response.ok) {
throw new Error('Error while deleting the repository');
}
})
.catch(error => {
console.error('Error:', error);
});
}
// Load the watched repos when the page loads
refreshWatchedRepos();

View File

@@ -1,94 +0,0 @@
import requests
import sqlite3
import logging
logging.basicConfig(
level=logging.INFO,
format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
)
logger = logging.getLogger(__name__)
def get_db_connection():
return sqlite3.connect("/github-ntfy/ghntfy_versions.db", check_same_thread=False)
def github_send_to_discord(releases, webhook_url):
conn = get_db_connection()
cursor = conn.cursor()
for release in releases:
app_name = release["repo"].split("/")[-1]
version_number = release["tag_name"]
app_url = release["html_url"]
changelog = release["changelog"]
release_date = release["published_at"].replace("T", " ").replace("Z", "")
cursor.execute("SELECT version FROM versions WHERE repo=?", (app_name,))
previous_version = cursor.fetchone()
if previous_version and previous_version[0] == version_number:
logger.info(f"The version of {app_name} has not changed. No notification sent.")
continue # Move on to the next application
message = f"📌 *New version*: {version_number}\n\n📦*For*: {app_name}\n\n📅 *Published on*: {release_date}\n\n📝 *Changelog*:\n\n```{changelog}```"
if len(message) > 2000:
message = f"📌 *New version*: {version_number}\n\n📦*For*: {app_name}\n\n📅 *Published on*: {release_date}\n\n🔗 *Release Link*: {app_url}"
# Updating the previous version for this application
cursor.execute(
"INSERT OR REPLACE INTO versions (repo, version, changelog) VALUES (?, ?, ?)",
(app_name, version_number, changelog),
)
conn.commit()
data = {
"content": message,
"username": "GitHub Ntfy"
}
headers = {
"Content-Type": "application/json"
}
response = requests.post(webhook_url, json=data, headers=headers)
if 200 <= response.status_code < 300:
logger.info(f"Message sent to Discord for {app_name}")
else:
logger.error(f"Failed to send message to Discord. Status code: {response.status_code}")
logger.error(f"Response: {response.text}")
conn.close()
def docker_send_to_discord(releases, webhook_url):
conn = get_db_connection()
cursor = conn.cursor()
for release in releases:
app_name = release["repo"].split("/")[-1]
digest_number = release["digest"]
app_url = release["html_url"]
release_date = release["published_at"].replace("T", " ").replace("Z", "")
cursor.execute("SELECT digest FROM docker_versions WHERE repo=?", (app_name,))
previous_digest = cursor.fetchone()
if previous_digest and previous_digest[0] == digest_number:
logger.info(f"The digest of {app_name} has not changed. No notification sent.")
continue
message = f"🐳 *Docker Image Updated!*\n\n🔐 *New Digest*: `{digest_number}`\n\n📦 *App*: {app_name}\n\n📢*Published*: {release_date}\n\n🔗 *Link*: {app_url}"
cursor.execute(
"INSERT OR REPLACE INTO docker_versions (repo, digest) VALUES (?, ?)",
(app_name, digest_number),
)
conn.commit()
data = {
"content": message,
"username": "GitHub Ntfy"
}
headers = {
"Content-Type": "application/json"
}
logger.info(f"Sending payload to Discord: {data}")
response = requests.post(webhook_url, json=data, headers=headers)
if 200 <= response.status_code < 300:
logger.info(f"Message sent to Discord for {app_name}")
else:
logger.error(f"Failed to send message to Discord. Status code: {response.status_code}")
logger.error(f"Response: {response.text}")
conn.close()

View File

@@ -1,98 +0,0 @@
import requests
import sqlite3
import logging
logging.basicConfig(
level=logging.INFO,
format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
)
logger = logging.getLogger(__name__)
def get_db_connection():
return sqlite3.connect("/github-ntfy/ghntfy_versions.db", check_same_thread=False)
def github_send_to_gotify(releases, token, url):
conn = get_db_connection()
cursor = conn.cursor()
url = url + "/message"
url = url + "?token=" + token
for release in releases:
app_name = release["repo"].split("/")[-1] # Getting the application name from the repo
version_number = release["tag_name"] # Getting the version number
app_url = release["html_url"] # Getting the application URL
changelog = release["changelog"] # Getting the changelog
release_date = release["published_at"] # Getting the release date
release_date = release_date.replace("T", " ").replace("Z", "") # Formatting the release date
# Checking if the version has changed since the last time
cursor.execute(
"SELECT version FROM versions WHERE repo=?",
(app_name,),
)
previous_version = cursor.fetchone()
if previous_version and previous_version[0] == version_number:
logger.info(f"The version of {app_name} has not changed. No notification sent.")
continue # Move on to the next application
message = f"📌 *New version*: {version_number}\n\n📦*For*: {app_name}\n\n📅 *Published on*: {release_date}\n\n📝 *Changelog*:\n\n```{changelog}```\n\n🔗 *Release Url*:{app_url}"
# Updating the previous version for this application
cursor.execute(
"INSERT OR REPLACE INTO versions (repo, version, changelog) VALUES (?, ?, ?)",
(app_name, version_number, changelog),
)
conn.commit()
content = {
"title": f"New version for {app_name}",
"message": message,
"priority": "2",
}
response = requests.post(url, json=content)
if response.status_code == 200:
logger.info(f"Message sent to Gotify for {app_name}")
continue
else:
logger.error(f"Failed to send message to Gotify. Status code: {response.status_code}")
def docker_send_to_gotify(releases, token, url):
conn = get_db_connection()
cursor = conn.cursor()
url = url + "/message"
url = url + "?token=" + token
for release in releases:
app_name = release["repo"].split("/")[-1] # Getting the application name from the repo
digest_number = release["digest"]
app_url = release["html_url"] # Getting the application URL
release_date = release["published_at"] # Getting the release date
release_date = release_date.replace("T", " ").replace("Z", "") # Formatting the release date
# Checking if the version has changed since the last time
cursor.execute(
"SELECT digest FROM docker_versions WHERE repo=?",
(app_name,),
)
previous_digest = cursor.fetchone()
if previous_digest and previous_digest[0] == digest_number:
logger.info(f"The digest of {app_name} has not changed. No notification sent.")
continue # Move on to the next application
message = f"🐳 *Docker Image Updated!*\n\n🔐 *New Digest*: `{digest_number}`\n\n📦 *App*: {app_name}\n\n📢 *Published*: {release_date}\n\n🔗 *Release Url*:{app_url}"
# Updating the previous digest for this application
cursor.execute(
"INSERT OR REPLACE INTO docker_versions (repo, digest) VALUES (?, ?, ?)",
(app_name, digest_number),
)
conn.commit()
content = {
"title": f"New version for {app_name}",
"message": message,
"priority": "2",
}
response = requests.post(url, json=content)
if response.status_code == 200:
logger.info(f"Message sent to Gotify for {app_name}")
continue
else:
logger.error(f"Failed to send message to Gotify. Status code: {response.status_code}")

View File

@@ -1,98 +0,0 @@
import requests
import sqlite3
import logging
logging.basicConfig(
level=logging.INFO,
format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
)
logger = logging.getLogger(__name__)
def get_db_connection():
return sqlite3.connect("/github-ntfy/ghntfy_versions.db", check_same_thread=False)
def github_send_to_ntfy(releases, auth, url):
conn = get_db_connection()
cursor = conn.cursor()
for release in releases:
app_name = release["repo"].split("/")[-1] # Getting the application name from the repo
version_number = release["tag_name"] # Getting the version number
app_url = release["html_url"] # Getting the application URL
changelog = release["changelog"] # Getting the changelog
release_date = release["published_at"] # Getting the release date
release_date = release_date.replace("T", " ").replace("Z", "") # Formatting the release date
# Checking if the version has changed since the last time
cursor.execute(
"SELECT version FROM versions WHERE repo=?",
(app_name,),
)
previous_version = cursor.fetchone()
if previous_version and previous_version[0] == version_number:
logger.info(f"The version of {app_name} has not changed. No notification sent.")
continue # Move on to the next application
message = f"📌 *New version*: {version_number}\n\n📦*For*: {app_name}\n\n📅 *Published on*: {release_date}\n\n📝 *Changelog*:\n\n```{changelog}```\n\n 🔗 *Release Url*: {app_url}"
# Updating the previous version for this application
cursor.execute(
"INSERT OR REPLACE INTO versions (repo, version, changelog) VALUES (?, ?, ?)",
(app_name, version_number, changelog),
)
conn.commit()
headers = {
"Authorization": f"Basic {auth}",
"Title": f"New version for {app_name}",
"Priority": "urgent",
"Markdown": "yes",
"Actions": f"view, Update {app_name}, {app_url}, clear=true",
}
response = requests.post(f"{url}", headers=headers, data=message)
if response.status_code == 200:
logger.info(f"Message sent to Ntfy for {app_name}")
continue
else:
logger.error(f"Failed to send message to Ntfy. Status code: {response.status_code}")
def docker_send_to_ntfy(releases, auth, url):
conn = get_db_connection()
cursor = conn.cursor()
for release in releases:
app_name = release["repo"].split("/")[-1] # Getting the application name from the repo
digest_number = release["digest"]
app_url = release["html_url"] # Getting the application URL
release_date = release["published_at"] # Getting the release date
release_date = release_date.replace("T", " ").replace("Z", "") # Formatting the release date
# Checking if the version has changed since the last time
cursor.execute(
"SELECT digest FROM docker_versions WHERE repo=?",
(app_name,),
)
previous_digest = cursor.fetchone()
if previous_digest and previous_digest[0] == digest_number:
logger.info(f"The digest of {app_name} has not changed. No notification sent.")
continue # Move on to the next application
message = f"🐳 *Docker Image Updated!*\n\n🔐 *New Digest*: `{digest_number}`\n\n📦 *App*: {app_name}\n\n📢*Published*: {release_date}\n\n 🔗 *Release Url*: {app_url}"
# Updating the previous digest for this application
cursor.execute(
"INSERT OR REPLACE INTO docker_versions (repo, digest) VALUES (?, ?, ?)",
(app_name, digest_number),
)
conn.commit()
headers = {
"Authorization": f"Basic {auth}",
"Title": f"🆕 New version for {app_name}",
"Priority": "urgent",
"Markdown": "yes",
"Actions": f"View, Update {app_name}, {app_url}, clear=true",
}
response = requests.post(f"{url}", headers=headers, data=message)
if response.status_code == 200:
logger.info(f"Message sent to Ntfy for {app_name}")
continue
else:
logger.error(f"Failed to send message to Ntfy. Status code: {response.status_code}")

View File

@@ -1,131 +0,0 @@
import requests
import sqlite3
import logging
logging.basicConfig(
level=logging.INFO,
format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
)
logger = logging.getLogger(__name__)
def get_db_connection():
return sqlite3.connect("/github-ntfy/ghntfy_versions.db", check_same_thread=False)
def github_send_to_slack(releases, webhook_url):
conn = get_db_connection()
cursor = conn.cursor()
for release in releases:
app_name = release["repo"].split("/")[-1]
version_number = release["tag_name"]
app_url = release["html_url"]
changelog = release["changelog"]
release_date = release["published_at"].replace("T", " ").replace("Z", "")
cursor.execute("SELECT version FROM versions WHERE repo=?", (app_name,))
previous_version = cursor.fetchone()
if previous_version and previous_version[0] == version_number:
logger.info(f"The version of {app_name} has not changed. No notification sent.")
continue
message = f"📌 *New version*: {version_number}\n\n📦*For*: {app_name}\n\n📅 *Published on*: {release_date}\n\n📝 *Changelog*:\n\n```{changelog}```"
if len(message) > 2000:
message = f"📌 *New version*: {version_number}\n\n📦*For*: {app_name}\n\n📅 *Published on*: {release_date}\n\n📝 *Changelog*:\n\n `truncated..` use 🔗 instead "
cursor.execute(
"INSERT OR REPLACE INTO versions (repo, version, changelog) VALUES (?, ?, ?)",
(app_name, version_number, changelog),
)
conn.commit()
message = {
"blocks": [
{
"type": "section",
"text": {
"type": "mrkdwn",
"text": f"{message}"
},
"accessory": {
"type": "button",
"text": {
"type": "plain_text",
"text": "🔗 Release Url"
},
"url": f"{app_url}",
"action_id": "button-action"
}
},
{
"type": "divider"
}
]
}
headers = {
"Content-Type": "application/json"
}
response = requests.post(webhook_url, json=message, headers=headers)
if response.status_code == 200:
logger.info(f"Message sent to Slack for {app_name}")
else:
logger.error(f"Failed to send message to Slack. Status code: {response.status_code}")
logger.error(f"Response: {response.text}")
conn.close()
def docker_send_to_slack(releases, webhook_url):
conn = get_db_connection()
cursor = conn.cursor()
for release in releases:
app_name = release["repo"].split("/")[-1]
digest_number = release["digest"]
app_url = release["html_url"]
release_date = release["published_at"].replace("T", " ").replace("Z", "")
cursor.execute("SELECT digest FROM docker_versions WHERE repo=?", (app_name,))
previous_digest = cursor.fetchone()
if previous_digest and previous_digest[0] == digest_number:
logger.info(f"The digest of {app_name} has not changed. No notification sent.")
continue
message = f"🐳 *Docker Image Updated!*\n\n🔐 *New Digest*: `{digest_number}`\n\n📦 *App*: {app_name}\n\n📢*Published*: {release_date}"
cursor.execute(
"INSERT OR REPLACE INTO docker_versions (repo, digest) VALUES (?, ?)",
(app_name, digest_number),
)
conn.commit()
message = {
"blocks": [
{
"type": "section",
"text": {
"type": "mrkdwn",
"text": f"{message}"
},
"accessory": {
"type": "button",
"text": {
"type": "plain_text",
"text": "🔗 Release Url"
},
"url": f"{app_url}",
"action_id": "button-action"
}
},
{
"type": "divider"
}
]
}
headers = {
"Content-Type": "application/json"
}
response = requests.post(webhook_url, json=message, headers=headers)
if 200 <= response.status_code < 300:
logger.info(f"Message sent to Slack for {app_name}")
else:
logger.error(f"Failed to send message to Slack. Status code: {response.status_code}")
logger.error(f"Response: {response.text}")
conn.close()

874
src/api.rs Normal file
View File

@@ -0,0 +1,874 @@
use log::{error, info};
use rusqlite::{Connection, params};
use serde_json::json;
use std::env;
use std::sync::Arc;
use tokio::sync::Mutex;
use warp::{Filter, Reply, Rejection};
use warp::http::StatusCode;
use serde::{Serialize, Deserialize};
use chrono::Utc;
use crate::database::{
get_user_by_username, verify_password, create_user, create_session,
get_session, delete_session, get_app_settings, update_app_settings
};
use crate::models::{UserLogin, UserRegistration, AuthResponse, ApiResponse, AppSettings};
#[derive(Debug, Serialize, Deserialize)]
struct RepoRequest {
repo: String,
}
#[derive(Debug, Serialize, Deserialize)]
struct UpdateInfo {
date: String,
repo: String,
version: String,
changelog: String,
}
pub async fn start_api() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
// Open the database
let db_path = env::var("DB_PATH").unwrap_or_else(|_| "/github-ntfy".to_string());
std::fs::create_dir_all(&db_path).ok();
let repos_path = format!("{}/watched_repos.db", db_path);
let versions_path = format!("{}/ghntfy_versions.db", db_path);
match Connection::open(&repos_path) {
Ok(conn) => {
info!("Database connection established successfully");
let db = Arc::new(Mutex::new(conn));
let versions_conn = match Connection::open(&versions_path) {
Ok(c) => c,
Err(e) => {
error!("Unable to open versions database: {}", e);
return Err(Box::new(e));
}
};
let versions_db = Arc::new(Mutex::new(versions_conn));
// Route definitions
let add_github = warp::path("app_github_repo")
.and(warp::post())
.and(warp::body::json())
.and(with_db(db.clone()))
.and_then(add_github_repo);
let add_docker = warp::path("app_docker_repo")
.and(warp::post())
.and(warp::body::json())
.and(with_db(db.clone()))
.and_then(add_docker_repo);
let get_github = warp::path("watched_repos")
.and(warp::get())
.and(with_db(db.clone()))
.and_then(get_github_repos);
let get_docker = warp::path("watched_docker_repos")
.and(warp::get())
.and(with_db(db.clone()))
.and_then(get_docker_repos);
let delete_github = warp::path("delete_repo")
.and(warp::post())
.and(warp::body::json())
.and(with_db(db.clone()))
.and_then(delete_github_repo);
let delete_docker = warp::path("delete_docker_repo")
.and(warp::post())
.and(warp::body::json())
.and(with_db(db.clone()))
.and_then(delete_docker_repo);
let get_updates = warp::path("latest_updates")
.and(warp::get())
.and(with_db(db.clone()))
.and_then(get_latest_updates);
let login_route = warp::path("auth")
.and(warp::path("login"))
.and(warp::post())
.and(warp::body::json())
.and(with_db(versions_db.clone()))
.and_then(login);
let register_route = warp::path("auth")
.and(warp::path("register"))
.and(warp::post())
.and(warp::body::json())
.and(with_db(versions_db.clone()))
.and_then(register);
let logout_route = warp::path("auth")
.and(warp::path("logout"))
.and(warp::post())
.and(with_auth())
.and(with_db(versions_db.clone()))
.and_then(logout);
let get_settings_route = warp::path("settings")
.and(warp::get())
.and(with_db(versions_db.clone()))
.and(with_auth())
.and_then(get_settings);
let update_settings_route = warp::path("settings")
.and(warp::put())
.and(warp::body::json())
.and(with_db(versions_db.clone()))
.and(with_auth())
.and_then(update_settings);
let is_configured_route = warp::path("is_configured")
.and(warp::get())
.and(with_db(versions_db.clone()))
.and_then(is_configured);
// Configure CORS
let cors = warp::cors()
.allow_any_origin()
.allow_headers(vec!["Content-Type", "Authorization"])
.allow_methods(vec!["GET", "POST", "PUT", "DELETE"]);
// Combine all routes with CORS
let routes = add_github
.or(add_docker)
.or(get_github)
.or(get_docker)
.or(delete_github)
.or(delete_docker)
.or(get_updates)
.or(login_route)
.or(register_route)
.or(logout_route)
.or(get_settings_route)
.or(update_settings_route)
.or(is_configured_route)
.with(cors);
// Start the server
info!("Starting API on 0.0.0.0:5000");
warp::serve(routes).run(([0, 0, 0, 0], 5000)).await;
Ok(())
},
Err(e) => {
error!("Unable to open database: {}", e);
Err(Box::new(e))
}
}
}
fn with_db(db: Arc<Mutex<Connection>>) -> impl Filter<Extract = (Arc<Mutex<Connection>>,), Error = std::convert::Infallible> + Clone {
warp::any().map(move || db.clone())
}
fn with_auth() -> impl Filter<Extract = (String,), Error = warp::Rejection> + Clone {
warp::header::<String>("Authorization")
.map(|header: String| {
if header.starts_with("Bearer ") {
header[7..].to_string()
} else {
header
}
})
.or_else(|_| async {
Err(warp::reject::custom(AuthError::MissingToken))
})
}
#[derive(Debug)]
enum AuthError {
MissingToken,
}
impl warp::reject::Reject for AuthError {}
async fn add_github_repo(body: RepoRequest, db: Arc<Mutex<Connection>>) -> Result<impl Reply, Rejection> {
let repo = body.repo;
if repo.is_empty() {
return Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": "The 'repo' field is required."})),
StatusCode::BAD_REQUEST
));
}
let db_guard = db.lock().await;
// Check if repository already exists
match db_guard.query_row(
"SELECT COUNT(*) FROM watched_repos WHERE repo = ?",
params![repo],
|row| row.get::<_, i64>(0)
) {
Ok(count) if count > 0 => {
return Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": format!("GitHub repository {} is already in the database.", repo)})),
StatusCode::CONFLICT
));
},
Err(e) => {
error!("Error while checking repository: {}", e);
return Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": "An internal server error occurred."})),
StatusCode::INTERNAL_SERVER_ERROR
));
},
_ => {}
}
// Add the repository
match db_guard.execute("INSERT INTO watched_repos (repo) VALUES (?)", params![repo]) {
Ok(_) => {
Ok(warp::reply::with_status(
warp::reply::json(&json!({"message": format!("GitHub repository {} has been added to watched repositories.", repo)})),
StatusCode::OK
))
},
Err(e) => {
error!("Error while adding repository: {}", e);
Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": format!("Database error: {}", e)})),
StatusCode::INTERNAL_SERVER_ERROR
))
}
}
}
async fn add_docker_repo(body: RepoRequest, db: Arc<Mutex<Connection>>) -> Result<impl Reply, Rejection> {
let repo = body.repo;
if repo.is_empty() {
return Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": "The 'repo' field is required."})),
StatusCode::BAD_REQUEST
));
}
let db_guard = db.lock().await;
// Check if repository already exists
match db_guard.query_row(
"SELECT COUNT(*) FROM docker_watched_repos WHERE repo = ?",
params![repo],
|row| row.get::<_, i64>(0)
) {
Ok(count) if count > 0 => {
return Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": format!("Docker repository {} is already in the database.", repo)})),
StatusCode::CONFLICT
));
},
Err(e) => {
error!("Error while checking repository: {}", e);
return Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": format!("Database error: {}", e)})),
StatusCode::INTERNAL_SERVER_ERROR
));
},
_ => {}
}
// Add the repository
match db_guard.execute("INSERT INTO docker_watched_repos (repo) VALUES (?)", params![repo]) {
Ok(_) => {
Ok(warp::reply::with_status(
warp::reply::json(&json!({"message": format!("Docker repository {} has been added to watched repositories.", repo)})),
StatusCode::OK
))
},
Err(e) => {
error!("Error while adding repository: {}", e);
Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": format!("Database error: {}", e)})),
StatusCode::INTERNAL_SERVER_ERROR
))
}
}
}
async fn get_github_repos(db: Arc<Mutex<Connection>>) -> Result<impl Reply, Rejection> {
// Solution: collect all results inside the locked block
let repos = {
let db_guard = db.lock().await;
let mut stmt = match db_guard.prepare("SELECT repo FROM watched_repos") {
Ok(stmt) => stmt,
Err(e) => {
error!("Error while preparing query: {}", e);
return Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": format!("Database error: {}", e)})),
StatusCode::INTERNAL_SERVER_ERROR
));
}
};
let rows = match stmt.query_map([], |row| row.get::<_, String>(0)) {
Ok(rows) => rows,
Err(e) => {
error!("Error while executing query: {}", e);
return Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": format!("Database error: {}", e)})),
StatusCode::INTERNAL_SERVER_ERROR
));
}
};
let mut repos = Vec::new();
for row in rows {
if let Ok(repo) = row {
repos.push(repo);
}
}
repos
}; // Lock is released here
Ok(warp::reply::with_status(
warp::reply::json(&repos),
StatusCode::OK
))
}
async fn get_docker_repos(db: Arc<Mutex<Connection>>) -> Result<impl Reply, Rejection> {
// Solution: collect all results inside the locked block
let repos = {
let db_guard = db.lock().await;
let mut stmt = match db_guard.prepare("SELECT repo FROM docker_watched_repos") {
Ok(stmt) => stmt,
Err(e) => {
error!("Error while preparing query: {}", e);
return Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": format!("Database error: {}", e)})),
StatusCode::INTERNAL_SERVER_ERROR
));
}
};
let rows = match stmt.query_map([], |row| row.get::<_, String>(0)) {
Ok(rows) => rows,
Err(e) => {
error!("Error while executing query: {}", e);
return Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": format!("Database error: {}", e)})),
StatusCode::INTERNAL_SERVER_ERROR
));
}
};
let mut repos = Vec::new();
for row in rows {
if let Ok(repo) = row {
repos.push(repo);
}
}
repos
}; // Lock is released here
Ok(warp::reply::with_status(
warp::reply::json(&repos),
StatusCode::OK
))
}
async fn delete_github_repo(body: RepoRequest, db: Arc<Mutex<Connection>>) -> Result<impl Reply, Rejection> {
let repo = body.repo;
if repo.is_empty() {
return Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": "The 'repo' field is required."})),
StatusCode::BAD_REQUEST
));
}
let db_guard = db.lock().await;
// Check if repository exists
match db_guard.query_row(
"SELECT COUNT(*) FROM watched_repos WHERE repo = ?",
params![repo],
|row| row.get::<_, i64>(0)
) {
Ok(count) if count == 0 => {
return Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": format!("GitHub repository {} is not in the database.", repo)})),
StatusCode::NOT_FOUND
));
},
Err(e) => {
error!("Error while checking repository: {}", e);
return Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": format!("Database error: {}", e)})),
StatusCode::INTERNAL_SERVER_ERROR
));
},
_ => {}
}
// Delete the repository
match db_guard.execute("DELETE FROM watched_repos WHERE repo = ?", params![repo]) {
Ok(_) => {
Ok(warp::reply::with_status(
warp::reply::json(&json!({"message": format!("GitHub repository {} has been removed from watched repositories.", repo)})),
StatusCode::OK
))
},
Err(e) => {
error!("Error while deleting repository: {}", e);
Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": format!("Database error: {}", e)})),
StatusCode::INTERNAL_SERVER_ERROR
))
}
}
}
async fn delete_docker_repo(body: RepoRequest, db: Arc<Mutex<Connection>>) -> Result<impl Reply, Rejection> {
let repo = body.repo;
if repo.is_empty() {
return Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": "The 'repo' field is required."})),
StatusCode::BAD_REQUEST
));
}
let db_guard = db.lock().await;
// Check if repository exists
match db_guard.query_row(
"SELECT COUNT(*) FROM docker_watched_repos WHERE repo = ?",
params![repo],
|row| row.get::<_, i64>(0)
) {
Ok(count) if count == 0 => {
return Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": format!("Docker repository {} is not in the database.", repo)})),
StatusCode::NOT_FOUND
));
},
Err(e) => {
error!("Error while checking repository: {}", e);
return Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": format!("Database error: {}", e)})),
StatusCode::INTERNAL_SERVER_ERROR
));
},
_ => {}
}
// Delete the repository
match db_guard.execute("DELETE FROM docker_watched_repos WHERE repo = ?", params![repo]) {
Ok(_) => {
Ok(warp::reply::with_status(
warp::reply::json(&json!({"message": format!("Docker repository {} has been removed from watched repositories.", repo)})),
StatusCode::OK
))
},
Err(e) => {
error!("Error while deleting repository: {}", e);
Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": format!("Database error: {}", e)})),
StatusCode::INTERNAL_SERVER_ERROR
))
}
}
}
async fn get_latest_updates(db: Arc<Mutex<Connection>>) -> Result<impl Reply, Rejection> {
let updates = {
let _db_guard = db.lock().await;
let db_path = env::var("DB_PATH").unwrap_or_else(|_| "/github-ntfy".to_string());
let versions_path = format!("{}/ghntfy_versions.db", db_path);
match Connection::open(&versions_path) {
Ok(versions_db) => {
match versions_db.prepare("SELECT repo, version, changelog, datetime('now') as date FROM versions ORDER BY rowid DESC LIMIT 5") {
Ok(mut stmt) => {
let rows = match stmt.query_map([], |row| {
Ok(UpdateInfo {
repo: row.get(0)?,
version: row.get(1)?,
changelog: row.get(2)?,
date: row.get(3)?,
})
}) {
Ok(rows) => rows,
Err(e) => {
error!("Error executing query: {}", e);
return Ok(warp::reply::with_status(
warp::reply::json(&json!({"error": format!("Database error: {}", e)})),
StatusCode::INTERNAL_SERVER_ERROR
));
}
};
let mut updates = Vec::new();
for row in rows {
if let Ok(update) = row {
updates.push(update);
}
}
if updates.is_empty() {
vec![
UpdateInfo {
date: Utc::now().to_rfc3339(),
repo: "BreizhHardware/ntfy_alerts".to_string(),
version: "2.0.2".to_string(),
changelog: "- Aucune mise à jour trouvée dans la base de données\n- Ceci est une donnée d'exemple".to_string(),
}
]
} else {
updates
}
},
Err(e) => {
error!("Error preparing query: {}", e);
vec![
UpdateInfo {
date: Utc::now().to_rfc3339(),
repo: "Erreur".to_string(),
version: "N/A".to_string(),
changelog: format!("- Erreur lors de la préparation de la requête: {}", e),
}
]
}
}
},
Err(e) => {
error!("Error opening versions database: {}", e);
vec![
UpdateInfo {
date: Utc::now().to_rfc3339(),
repo: "Erreur".to_string(),
version: "N/A".to_string(),
changelog: format!("- Erreur lors de l'ouverture de la base de données: {}", e),
}
]
}
}
};
Ok(warp::reply::with_status(
warp::reply::json(&updates),
StatusCode::OK
))
}
async fn login(login: UserLogin, db: Arc<Mutex<Connection>>) -> Result<impl Reply, Rejection> {
let conn = db.lock().await;
match verify_password(&conn, &login.username, &login.password) {
Ok(true) => {
if let Ok(Some(user)) = get_user_by_username(&conn, &login.username) {
if let Ok(token) = create_session(&conn, user.id) {
let auth_response = AuthResponse {
token,
user: user.clone(),
};
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse {
success: true,
message: "Login successful".to_string(),
data: Some(auth_response),
}),
StatusCode::OK,
))
} else {
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse::<()> {
success: false,
message: "Error creating session".to_string(),
data: None,
}),
StatusCode::INTERNAL_SERVER_ERROR,
))
}
} else {
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse::<()> {
success: false,
message: "User not found".to_string(),
data: None,
}),
StatusCode::NOT_FOUND,
))
}
},
Ok(false) => {
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse::<()> {
success: false,
message: "Incorrect username or password".to_string(),
data: None,
}),
StatusCode::UNAUTHORIZED,
))
},
Err(_) => {
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse::<()> {
success: false,
message: "Internal server error".to_string(),
data: None,
}),
StatusCode::INTERNAL_SERVER_ERROR,
))
}
}
}
async fn register(registration: UserRegistration, db: Arc<Mutex<Connection>>) -> Result<impl Reply, Rejection> {
let conn = db.lock().await;
// Check if a user already exists with this username
if let Ok(Some(_)) = get_user_by_username(&conn, &registration.username) {
return Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse::<()> {
success: false,
message: "A user with this name already exists".to_string(),
data: None,
}),
StatusCode::CONFLICT,
));
}
// Create the new user
match create_user(&conn, &registration.username, &registration.password, registration.is_admin) {
Ok(user_id) => {
if let Ok(Some(user)) = get_user_by_username(&conn, &registration.username) {
if let Ok(token) = create_session(&conn, user_id) {
let auth_response = AuthResponse {
token,
user,
};
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse {
success: true,
message: "Registration successful".to_string(),
data: Some(auth_response),
}),
StatusCode::CREATED,
))
} else {
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse::<()> {
success: false,
message: "Error creating session".to_string(),
data: None,
}),
StatusCode::INTERNAL_SERVER_ERROR,
))
}
} else {
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse::<()> {
success: false,
message: "Error retrieving user".to_string(),
data: None,
}),
StatusCode::INTERNAL_SERVER_ERROR,
))
}
},
Err(_) => {
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse::<()> {
success: false,
message: "Error creating user".to_string(),
data: None,
}),
StatusCode::INTERNAL_SERVER_ERROR,
))
}
}
}
async fn logout(token: String, db: Arc<Mutex<Connection>>) -> Result<impl Reply, Rejection> {
let conn = db.lock().await;
match delete_session(&conn, &token) {
Ok(_) => {
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse::<()> {
success: true,
message: "Logout successful".to_string(),
data: None,
}),
StatusCode::OK,
))
},
Err(_) => {
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse::<()> {
success: false,
message: "Error during logout".to_string(),
data: None,
}),
StatusCode::INTERNAL_SERVER_ERROR,
))
}
}
}
async fn get_settings(db: Arc<Mutex<Connection>>, token: String) -> Result<impl Reply, Rejection> {
let conn = db.lock().await;
// Verify authentication
if let Ok(Some(session)) = get_session(&conn, &token) {
if session.expires_at < Utc::now() {
return Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse::<()> {
success: false,
message: "Session expired".to_string(),
data: None,
}),
StatusCode::UNAUTHORIZED,
));
}
// Retrieve settings
match get_app_settings(&conn) {
Ok(Some(settings)) => {
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse {
success: true,
message: "Settings retrieved successfully".to_string(),
data: Some(settings),
}),
StatusCode::OK,
))
},
Ok(None) => {
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse::<()> {
success: false,
message: "No settings found".to_string(),
data: None,
}),
StatusCode::NOT_FOUND,
))
},
Err(_) => {
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse::<()> {
success: false,
message: "Error retrieving settings".to_string(),
data: None,
}),
StatusCode::INTERNAL_SERVER_ERROR,
))
}
}
} else {
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse::<()> {
success: false,
message: "Unauthorized".to_string(),
data: None,
}),
StatusCode::UNAUTHORIZED,
))
}
}
async fn update_settings(settings: AppSettings, db: Arc<Mutex<Connection>>, token: String) -> Result<impl Reply, Rejection> {
let conn = db.lock().await;
// Verify authentication
if let Ok(Some(session)) = get_session(&conn, &token) {
if session.expires_at < Utc::now() {
return Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse::<()> {
success: false,
message: "Session expired".to_string(),
data: None,
}),
StatusCode::UNAUTHORIZED,
));
}
// Update settings
match update_app_settings(&conn, &settings) {
Ok(_) => {
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse::<()> {
success: true,
message: "Settings updated successfully".to_string(),
data: None,
}),
StatusCode::OK,
))
},
Err(_) => {
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse::<()> {
success: false,
message: "Error updating settings".to_string(),
data: None,
}),
StatusCode::INTERNAL_SERVER_ERROR,
))
}
}
} else {
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse::<()> {
success: false,
message: "Unauthorized".to_string(),
data: None,
}),
StatusCode::UNAUTHORIZED,
))
}
}
// Function to check if the application is configured
async fn is_configured(db: Arc<Mutex<Connection>>) -> Result<impl Reply, Rejection> {
let conn = db.lock().await;
// Check if at least one admin user exists
let admin_exists = match conn.query_row(
"SELECT COUNT(*) FROM users WHERE is_admin = 1",
[],
|row| row.get::<_, i64>(0)
) {
Ok(count) => count > 0,
Err(_) => false,
};
// Check if settings are configured
let settings_exist = match get_app_settings(&conn) {
Ok(Some(settings)) => {
// Check if at least one notification service is configured
settings.ntfy_url.is_some() ||
settings.discord_webhook_url.is_some() ||
settings.slack_webhook_url.is_some() ||
settings.gotify_url.is_some()
},
_ => false,
};
Ok(warp::reply::with_status(
warp::reply::json(&ApiResponse {
success: true,
message: "Configuration status retrieved".to_string(),
data: Some(json!({
"configured": admin_exists && settings_exist,
"admin_exists": admin_exists,
"settings_exist": settings_exist
})),
}),
StatusCode::OK,
))
}

127
src/config.rs Normal file
View File

@@ -0,0 +1,127 @@
use dotenv::dotenv;
use log::info;
use reqwest::header::{HeaderMap, HeaderValue, AUTHORIZATION};
use std::env;
use std::fs::File;
use std::io::Read;
use rusqlite::Connection;
use crate::docker::create_dockerhub_token;
use crate::database::get_app_settings;
// Configuration
pub struct Config {
pub github_token: Option<String>,
pub docker_username: Option<String>,
pub docker_password: Option<String>,
pub docker_token: Option<String>,
pub ntfy_url: Option<String>,
pub gotify_url: Option<String>,
pub gotify_token: Option<String>,
pub discord_webhook_url: Option<String>,
pub slack_webhook_url: Option<String>,
pub auth: String,
pub timeout: f64,
}
impl Config {
pub fn from_env() -> Self {
dotenv().ok();
let docker_username = env::var("DOCKER_USERNAME").ok();
let docker_password = env::var("DOCKER_PASSWORD").ok();
let docker_token = if let (Some(username), Some(password)) = (&docker_username, &docker_password) {
create_dockerhub_token(username, password)
} else {
None
};
// Read authentication file
let mut auth = String::new();
if let Ok(mut file) = File::open("/auth.txt") {
file.read_to_string(&mut auth).ok();
auth = auth.trim().to_string();
}
Config {
github_token: env::var("GHNTFY_TOKEN").ok(),
docker_username,
docker_password,
docker_token,
ntfy_url: env::var("NTFY_URL").ok(),
gotify_url: env::var("GOTIFY_URL").ok(),
gotify_token: env::var("GOTIFY_TOKEN").ok(),
discord_webhook_url: env::var("DISCORD_WEBHOOK_URL").ok(),
slack_webhook_url: env::var("SLACK_WEBHOOK_URL").ok(),
auth,
timeout: env::var("GHNTFY_TIMEOUT")
.unwrap_or_else(|_| "3600".to_string())
.parse()
.unwrap_or(3600.0),
}
}
pub fn from_database(conn: &Connection) -> Self {
// First, try to load from database
if let Ok(Some(settings)) = get_app_settings(conn) {
let docker_username = settings.docker_username;
let docker_password = settings.docker_password.clone();
let docker_token = if let (Some(username), Some(password)) = (&docker_username, &docker_password) {
create_dockerhub_token(username, password)
} else {
None
};
// Read authentication file (for compatibility with the old system)
let mut auth = String::new();
if let Ok(mut file) = File::open("/auth.txt") {
file.read_to_string(&mut auth).ok();
auth = auth.trim().to_string();
}
let timeout = settings.check_interval.unwrap_or(3600) as f64;
info!("Configuration loaded from database");
return Config {
github_token: settings.github_token,
docker_username,
docker_password,
docker_token,
ntfy_url: settings.ntfy_url,
gotify_url: settings.gotify_url,
gotify_token: settings.gotify_token,
discord_webhook_url: settings.discord_webhook_url,
slack_webhook_url: settings.slack_webhook_url,
auth,
timeout,
};
}
// Fallback to environment variables if database is not available
info!("No configuration found in database, using environment variables");
Self::from_env()
}
pub fn github_headers(&self) -> HeaderMap {
let mut headers = HeaderMap::new();
if let Some(token) = &self.github_token {
headers.insert(
AUTHORIZATION,
HeaderValue::from_str(&format!("token {}", token)).unwrap(),
);
}
headers
}
pub fn docker_headers(&self) -> HeaderMap {
let mut headers = HeaderMap::new();
if let Some(token) = &self.docker_token {
headers.insert(
AUTHORIZATION,
HeaderValue::from_str(&format!("Bearer {}", token)).unwrap(),
);
}
headers
}
}

418
src/database.rs Normal file
View File

@@ -0,0 +1,418 @@
use log::info;
pub(crate) use rusqlite::{Connection, Result as SqliteResult, OpenFlags, Error as SqliteError};
use std::env;
use chrono::Utc;
use rand::Rng;
use bcrypt::{hash, verify, DEFAULT_COST};
use crate::models::{User, Session, AppSettings};
pub fn init_databases() -> SqliteResult<(Connection, Connection)> {
let db_path = env::var("DB_PATH").unwrap_or_else(|_| "/github-ntfy".to_string());
if let Err(e) = std::fs::create_dir_all(&db_path) {
info!("Error while creating directory {}: {}", db_path, e);
}
let versions_path = format!("{}/ghntfy_versions.db", db_path);
let repos_path = format!("{}/watched_repos.db", db_path);
let conn = Connection::open_with_flags(&versions_path, OpenFlags::SQLITE_OPEN_CREATE | OpenFlags::SQLITE_OPEN_READ_WRITE | OpenFlags::SQLITE_OPEN_URI)?;
info!("Database open at {}", versions_path);
conn.execute(
"CREATE TABLE IF NOT EXISTS versions (
repo TEXT PRIMARY KEY,
version TEXT,
changelog TEXT
)",
[],
)?;
conn.execute(
"CREATE TABLE IF NOT EXISTS docker_versions (
repo TEXT PRIMARY KEY,
digest TEXT
)",
[],
)?;
conn.execute(
"CREATE TABLE IF NOT EXISTS users (
id INTEGER PRIMARY KEY AUTOINCREMENT,
username TEXT UNIQUE NOT NULL,
password_hash TEXT NOT NULL,
is_admin INTEGER NOT NULL DEFAULT 0,
created_at TEXT NOT NULL
)",
[],
)?;
conn.execute(
"CREATE TABLE IF NOT EXISTS sessions (
token TEXT PRIMARY KEY,
user_id INTEGER NOT NULL,
expires_at TEXT NOT NULL,
FOREIGN KEY (user_id) REFERENCES users(id)
)",
[],
)?;
conn.execute(
"CREATE TABLE IF NOT EXISTS app_settings (
id INTEGER PRIMARY KEY CHECK (id = 1),
ntfy_url TEXT,
github_token TEXT,
docker_username TEXT,
docker_password TEXT,
gotify_url TEXT,
gotify_token TEXT,
discord_webhook_url TEXT,
slack_webhook_url TEXT,
check_interval INTEGER DEFAULT 3600,
auth TEXT,
last_updated TEXT NOT NULL
)",
[],
)?;
let admin_exists = conn
.query_row("SELECT COUNT(*) FROM users WHERE is_admin = 1", [], |row| {
row.get::<_, i64>(0)
})
.unwrap_or(0);
if admin_exists == 0 {
if let (Ok(username), Ok(password)) = (env::var("USERNAME"), env::var("PASSWORD")) {
if !username.is_empty() && !password.is_empty() {
let hashed_password = hash(password, DEFAULT_COST).unwrap_or_else(|_| String::new());
let now = Utc::now().to_rfc3339();
if let Err(e) = conn.execute(
"INSERT INTO users (username, password_hash, is_admin, created_at) VALUES (?, ?, 1, ?)",
&[&username, &hashed_password, &now],
) {
info!("Erreur lors de la création de l'utilisateur admin: {}", e);
} else {
info!("Utilisateur admin créé avec succès depuis les variables d'environnement");
}
}
}
}
let settings_exist = conn
.query_row("SELECT COUNT(*) FROM app_settings", [], |row| {
row.get::<_, i64>(0)
})
.unwrap_or(0);
if settings_exist == 0 {
let ntfy_url = env::var("NTFY_URL").ok();
let github_token = env::var("GHNTFY_TOKEN").ok();
let docker_username = env::var("DOCKER_USERNAME").ok();
let docker_password = env::var("DOCKER_PASSWORD").ok();
let gotify_url = env::var("GOTIFY_URL").ok();
let gotify_token = env::var("GOTIFY_TOKEN").ok();
let discord_webhook_url = env::var("DISCORD_WEBHOOK_URL").ok();
let slack_webhook_url = env::var("SLACK_WEBHOOK_URL").ok();
let check_interval = env::var("GHNTFY_TIMEOUT")
.ok()
.and_then(|s| s.parse::<i64>().ok())
.unwrap_or(3600);
let now = Utc::now().to_rfc3339();
if let Err(e) = conn.execute(
"INSERT INTO app_settings (id, ntfy_url, github_token, docker_username, docker_password, gotify_url, gotify_token, discord_webhook_url, slack_webhook_url, check_interval, last_updated)
VALUES (1, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
rusqlite::params![
ntfy_url,
github_token,
docker_username,
docker_password,
gotify_url,
gotify_token,
discord_webhook_url,
slack_webhook_url,
check_interval,
now
],
) {
info!("Erreur lors de l'initialisation des paramètres: {}", e);
} else {
info!("Paramètres initialisés avec succès depuis les variables d'environnement");
}
}
let conn2 = Connection::open_with_flags(&repos_path, OpenFlags::SQLITE_OPEN_CREATE | OpenFlags::SQLITE_OPEN_READ_WRITE | OpenFlags::SQLITE_OPEN_URI)?;
info!("Database open at {}", repos_path);
conn2.execute(
"CREATE TABLE IF NOT EXISTS watched_repos (
id INTEGER PRIMARY KEY,
repo TEXT
)",
[],
)?;
conn2.execute(
"CREATE TABLE IF NOT EXISTS docker_watched_repos (
id INTEGER PRIMARY KEY,
repo TEXT
)",
[],
)?;
Ok((conn, conn2))
}
// Functions to retrieve watched repositories
pub fn get_watched_repos(conn: &Connection) -> SqliteResult<Vec<String>> {
let mut stmt = conn.prepare("SELECT repo FROM watched_repos")?;
let repos_iter = stmt.query_map([], |row| Ok(row.get::<_, String>(0)?))?;
let mut repos = Vec::new();
for repo in repos_iter {
repos.push(repo?);
}
Ok(repos)
}
pub fn get_docker_watched_repos(conn: &Connection) -> SqliteResult<Vec<String>> {
let mut stmt = conn.prepare("SELECT repo FROM docker_watched_repos")?;
let repos_iter = stmt.query_map([], |row| Ok(row.get::<_, String>(0)?))?;
let mut repos = Vec::new();
for repo in repos_iter {
repos.push(repo?);
}
Ok(repos)
}
pub fn is_new_version(conn: &Connection, repo: &str, version: &str) -> SqliteResult<bool> {
let mut stmt = conn.prepare("SELECT version FROM versions WHERE repo = ?")?;
let result = stmt.query_map([repo], |row| row.get::<_, String>(0))?;
for stored_version in result {
if let Ok(v) = stored_version {
return Ok(v != version);
}
}
Ok(true)
}
pub fn update_version(conn: &Connection, repo: &str, version: &str, changelog: Option<&str>) -> SqliteResult<()> {
conn.execute(
"REPLACE INTO versions (repo, version, changelog) VALUES (?, ?, ?)",
[repo, version, changelog.unwrap_or("")],
)?;
Ok(())
}
pub fn create_user(conn: &Connection, username: &str, password: &str, is_admin: bool) -> SqliteResult<i64> {
let hashed_password = hash(password, DEFAULT_COST).map_err(|e| {
SqliteError::SqliteFailure(
rusqlite::ffi::Error::new(1),
Some(e.to_string())
)
})?;
let now = Utc::now().to_rfc3339();
conn.execute(
"INSERT INTO users (username, password_hash, is_admin, created_at) VALUES (?, ?, ?, ?)",
&[username, &hashed_password, &(if is_admin { 1 } else { 0 }).to_string(), &now],
)?;
Ok(conn.last_insert_rowid())
}
pub fn get_user_by_username(conn: &Connection, username: &str) -> SqliteResult<Option<User>> {
let mut stmt = conn.prepare("SELECT id, username, password_hash, is_admin, created_at FROM users WHERE username = ?")?;
let mut rows = stmt.query(&[username])?;
if let Some(row) = rows.next()? {
let id = row.get(0)?;
let username = row.get(1)?;
let password_hash = row.get(2)?;
let is_admin: i64 = row.get(3)?;
let created_at_str: String = row.get(4)?;
let created_at = chrono::DateTime::parse_from_rfc3339(&created_at_str)
.map(|dt| dt.with_timezone(&Utc))
.map_err(|e| {
SqliteError::SqliteFailure(
rusqlite::ffi::Error::new(1),
Some(e.to_string())
)
})?;
Ok(Some(User {
id,
username,
password_hash,
is_admin: is_admin == 1,
created_at,
}))
} else {
Ok(None)
}
}
pub fn verify_password(conn: &Connection, username: &str, password: &str) -> SqliteResult<bool> {
if let Some(user) = get_user_by_username(conn, username)? {
Ok(verify(password, &user.password_hash).unwrap_or(false))
} else {
Ok(false)
}
}
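// Creates a session valid for 7 days with a random hex token and returns the token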
pub fn create_session(conn: &Connection, user_id: i64) -> SqliteResult<String> {
let token = generate_session_token();
let expires_at = Utc::now() + chrono::Duration::days(7);
let expires_at_str = expires_at.to_rfc3339();
conn.execute(
"INSERT INTO sessions (token, user_id, expires_at) VALUES (?, ?, ?)",
&[&token, &user_id.to_string(), &expires_at_str],
)?;
Ok(token)
}
pub fn get_session(conn: &Connection, token: &str) -> SqliteResult<Option<Session>> {
let mut stmt = conn.prepare("SELECT token, user_id, expires_at FROM sessions WHERE token = ?")?;
let mut rows = stmt.query(&[token])?;
if let Some(row) = rows.next()? {
let token = row.get(0)?;
let user_id = row.get(1)?;
let expires_at_str: String = row.get(2)?;
let expires_at = chrono::DateTime::parse_from_rfc3339(&expires_at_str)
.map(|dt| dt.with_timezone(&Utc))
.map_err(|e| {
SqliteError::SqliteFailure(
rusqlite::ffi::Error::new(1),
Some(e.to_string())
)
})?;
Ok(Some(Session {
token,
user_id,
expires_at,
}))
} else {
Ok(None)
}
}
pub fn delete_session(conn: &Connection, token: &str) -> SqliteResult<()> {
conn.execute(
"DELETE FROM sessions WHERE token = ?",
&[token],
)?;
Ok(())
}
pub fn get_app_settings(conn: &Connection) -> SqliteResult<Option<AppSettings>> {
let mut stmt = conn.prepare(
"SELECT id, ntfy_url, github_token, docker_username, docker_password,
gotify_url, gotify_token, discord_webhook_url, slack_webhook_url,
check_interval, auth, last_updated
FROM app_settings
WHERE id = 1"
)?;
let mut rows = stmt.query([])?;
if let Some(row) = rows.next()? {
let id = row.get(0)?;
let ntfy_url = row.get(1)?;
let github_token = row.get(2)?;
let docker_username = row.get(3)?;
let docker_password = row.get(4)?;
let gotify_url = row.get(5)?;
let gotify_token = row.get(6)?;
let discord_webhook_url = row.get(7)?;
let slack_webhook_url = row.get(8)?;
let check_interval = row.get(9)?;
let auth = row.get(10)?;
let last_updated_str: String = row.get(11)?;
let last_updated = chrono::DateTime::parse_from_rfc3339(&last_updated_str)
.map(|dt| dt.with_timezone(&Utc))
.map_err(|e| {
SqliteError::SqliteFailure(
rusqlite::ffi::Error::new(1),
Some(e.to_string())
)
})?;
Ok(Some(AppSettings {
id: Some(id),
ntfy_url,
github_token,
docker_username,
docker_password,
gotify_url,
gotify_token,
discord_webhook_url,
slack_webhook_url,
check_interval,
auth,
last_updated,
}))
} else {
Ok(None)
}
}
pub fn update_app_settings(conn: &Connection, settings: &AppSettings) -> SqliteResult<()> {
let now = Utc::now().to_rfc3339();
conn.execute(
"UPDATE app_settings
SET ntfy_url = ?, github_token = ?, docker_username = ?, docker_password = ?,
gotify_url = ?, gotify_token = ?, discord_webhook_url = ?, slack_webhook_url = ?,
check_interval = ?, auth = ?, last_updated = ?
WHERE id = 1",
rusqlite::params![
settings.ntfy_url,
settings.github_token,
settings.docker_username,
settings.docker_password,
settings.gotify_url,
settings.gotify_token,
settings.discord_webhook_url,
settings.slack_webhook_url,
settings.check_interval,
settings.auth,
now
],
)?;
// If auth credentials are provided, write them to the auth.txt file
if let Some(auth) = &settings.auth {
if !auth.is_empty() {
if let Err(e) = std::fs::write("/auth.txt", auth) {
log::error!("Error writing to auth.txt file: {}", e);
} else {
log::info!("Successfully updated auth.txt file");
}
}
}
Ok(())
}
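// Generates a 64-character hexadecimal token from 32 random bytes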
fn generate_session_token() -> String {
let mut rng = rand::thread_rng();
let token_bytes: Vec<u8> = (0..32).map(|_| rng.gen::<u8>()).collect();
// Convert to hexadecimal
token_bytes.iter()
.map(|b| format!("{:02x}", b))
.collect::<Vec<String>>()
.join("")
}

73
src/docker.rs Normal file
View File

@@ -0,0 +1,73 @@
use log::error;
use reqwest::header::{HeaderMap, HeaderValue, CONTENT_TYPE};
use serde_json::json;
use crate::models::{DockerTag, DockerReleaseInfo};
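// Authenticates against Docker Hub and returns the resulting token on success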
pub fn create_dockerhub_token(username: &str, password: &str) -> Option<String> {
let client = reqwest::blocking::Client::new();
let mut headers = HeaderMap::new();
headers.insert(
CONTENT_TYPE,
HeaderValue::from_static("application/json"),
);
let data = json!({
"username": username,
"password": password
});
match client
.post("https://hub.docker.com/v2/users/login")
.headers(headers)
.json(&data)
.send()
{
Ok(response) => {
let status = response.status();
if status.is_success() {
if let Ok(json) = response.json::<serde_json::Value>() {
return json["token"].as_str().map(|s| s.to_string());
}
}
error!("DockerHub authentication failed: {}", status);
None
}
Err(e) => {
error!("Error connecting to DockerHub: {}", e);
None
}
}
}
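// Fetches the "latest" tag of each watched Docker repository and collects its digest and metadata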
pub async fn get_latest_docker_releases(
repos: &[String],
client: &reqwest::Client,
headers: HeaderMap,
) -> Vec<DockerReleaseInfo> {
let mut releases = Vec::new();
for repo in repos {
let url = format!("https://hub.docker.com/v2/repositories/{}/tags/latest", repo);
match client.get(&url).headers(headers.clone()).send().await {
Ok(response) => {
if response.status().is_success() {
if let Ok(tag) = response.json::<DockerTag>().await {
releases.push(DockerReleaseInfo {
repo: repo.clone(),
digest: tag.digest.clone(),
html_url: format!("https://hub.docker.com/r/{}", repo),
published_at: tag.last_updated,
});
}
} else {
error!("Error fetching Docker tag for {}: {}", repo, response.status());
}
}
Err(e) => {
error!("Error fetching Docker tag for {}: {}", repo, e);
}
}
}
releases
}

80
src/github.rs Normal file
View File

@@ -0,0 +1,80 @@
use log::{error, info};
use reqwest::header::HeaderMap;
use crate::models::{GithubRelease, GithubReleaseInfo};
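// Queries the GitHub API for the latest release of each watched repository, adding a default User-Agent header when none is set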
pub async fn get_latest_releases(
repos: &[String],
client: &reqwest::Client,
mut headers: HeaderMap
) -> Vec<GithubReleaseInfo> {
let mut releases = Vec::new();
if !headers.contains_key("User-Agent") {
headers.insert("User-Agent", "github-ntfy/1.0".parse().unwrap());
}
let has_auth = headers.contains_key("Authorization");
if !has_auth {
info!("Aucun token GitHub configuré, les requêtes seront limitées");
}
for repo in repos {
let url = format!("https://api.github.com/repos/{}/releases/latest", repo);
match client.get(&url).headers(headers.clone()).send().await {
Ok(response) => {
if response.status().is_success() {
if let Ok(release) = response.json::<GithubRelease>().await {
let changelog = get_changelog(repo, client, headers.clone()).await;
releases.push(GithubReleaseInfo {
repo: repo.clone(),
name: release.name,
tag_name: release.tag_name,
html_url: release.html_url,
changelog,
published_at: release.published_at.unwrap_or_else(|| "Unknown date".to_string()),
});
}
} else {
let status = response.status();
let body = response.text().await.unwrap_or_default();
error!("Erreur lors de la récupération de la release GitHub pour {}: {} - {}",
repo, status, body);
}
},
Err(e) => {
error!("Erreur de connexion pour {}: {}", repo, e);
}
}
}
releases
}
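// Returns the body of the most recent release for the repo, or a fallback string if unavailable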
pub async fn get_changelog(
repo: &str,
client: &reqwest::Client,
headers: HeaderMap,
) -> String {
let url = format!("https://api.github.com/repos/{}/releases", repo);
match client.get(&url).headers(headers).send().await {
Ok(response) => {
if response.status().is_success() {
if let Ok(releases) = response.json::<Vec<GithubRelease>>().await {
if !releases.is_empty() {
if let Some(body) = &releases[0].body {
return body.clone();
}
}
}
}
}
Err(e) => {
error!("Error retrieving changelog for {}: {}", repo, e);
}
}
"Changelog not available".to_string()
}

157
src/main.rs Normal file
View File

@@ -0,0 +1,157 @@
mod config;
mod models;
mod database;
mod github;
mod docker;
mod notifications;
mod api;
use log::{error, info};
use std::thread;
use std::time::Duration;
// Function to start the API in a separate thread
fn start_api() {
std::thread::spawn(|| {
let runtime = tokio::runtime::Runtime::new().unwrap();
runtime.block_on(async {
match api::start_api().await {
Ok(_) => info!("API closed correctly"),
Err(e) => error!("API error: {}", e),
}
});
});
}
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
env_logger::init();
// Initialize databases
let (conn_versions, conn_repos) = database::init_databases()?;
// Load environment variables
let env_config = config::Config::from_env();
// Only update database with env vars if they are explicitly set
// We check each field individually instead of overwriting everything
let has_env_notification = env_config.ntfy_url.is_some() ||
env_config.gotify_url.is_some() ||
env_config.discord_webhook_url.is_some() ||
env_config.slack_webhook_url.is_some();
if has_env_notification {
let now = chrono::Utc::now().to_rfc3339();
// First, ensure there's a record in the database
conn_versions.execute(
"INSERT OR IGNORE INTO app_settings (id, last_updated) VALUES (1, ?)",
rusqlite::params![now],
).map_err(|e| error!("Failed to initialize app settings: {}", e)).ok();
// Then update only the fields that are set in environment variables
if let Some(ntfy_url) = &env_config.ntfy_url {
conn_versions.execute(
"UPDATE app_settings SET ntfy_url = ?, last_updated = ? WHERE id = 1",
rusqlite::params![ntfy_url, now],
).ok();
}
if let Some(github_token) = &env_config.github_token {
conn_versions.execute(
"UPDATE app_settings SET github_token = ?, last_updated = ? WHERE id = 1",
rusqlite::params![github_token, now],
).ok();
}
if let Some(docker_username) = &env_config.docker_username {
conn_versions.execute(
"UPDATE app_settings SET docker_username = ?, last_updated = ? WHERE id = 1",
rusqlite::params![docker_username, now],
).ok();
}
if let Some(docker_password) = &env_config.docker_password {
conn_versions.execute(
"UPDATE app_settings SET docker_password = ?, last_updated = ? WHERE id = 1",
rusqlite::params![docker_password, now],
).ok();
}
if let Some(gotify_url) = &env_config.gotify_url {
conn_versions.execute(
"UPDATE app_settings SET gotify_url = ?, last_updated = ? WHERE id = 1",
rusqlite::params![gotify_url, now],
).ok();
}
if let Some(gotify_token) = &env_config.gotify_token {
conn_versions.execute(
"UPDATE app_settings SET gotify_token = ?, last_updated = ? WHERE id = 1",
rusqlite::params![gotify_token, now],
).ok();
}
if let Some(discord_webhook_url) = &env_config.discord_webhook_url {
conn_versions.execute(
"UPDATE app_settings SET discord_webhook_url = ?, last_updated = ? WHERE id = 1",
rusqlite::params![discord_webhook_url, now],
).ok();
}
if let Some(slack_webhook_url) = &env_config.slack_webhook_url {
conn_versions.execute(
"UPDATE app_settings SET slack_webhook_url = ?, last_updated = ? WHERE id = 1",
rusqlite::params![slack_webhook_url, now],
).ok();
}
conn_versions.execute(
"UPDATE app_settings SET check_interval = ?, last_updated = ? WHERE id = 1",
rusqlite::params![env_config.timeout as i64, now],
).ok();
info!("Configuration updated from environment variables (selective update)");
}
// Load configuration from database, with fallback to environment variables
let config = config::Config::from_database(&conn_versions);
// Check if configuration is complete
let config_is_incomplete = config.auth.is_empty() || (config.ntfy_url.is_none() && config.gotify_url.is_none()
&& config.discord_webhook_url.is_none() && config.slack_webhook_url.is_none());
let client = reqwest::Client::new();
// Now handle incomplete configuration
if config_is_incomplete {
info!("No notification service is configured.");
info!("Please configure at least one notification service via the web interface or environment variables.");
info!("Starting the REST API for configuration.");
// Start the REST API only if configuration is incomplete
start_api();
// Continue running to allow configuration through the API
loop {
thread::sleep(Duration::from_secs(60));
}
}
// Start the REST API only if configuration is complete
start_api();
info!("Starting version monitoring...");
loop {
let github_repos = database::get_watched_repos(&conn_repos)?;
let docker_repos = database::get_docker_watched_repos(&conn_repos)?;
let github_releases = github::get_latest_releases(&github_repos, &client, config.github_headers()).await;
let docker_releases = docker::get_latest_docker_releases(&docker_repos, &client, config.docker_headers()).await;
let _ = notifications::send_notifications(github_releases, docker_releases, &config, &conn_versions).await;
tokio::time::sleep(Duration::from_secs_f64(config.timeout)).await;
}
}

103
src/models.rs Normal file
View File

@@ -0,0 +1,103 @@
use serde::Deserialize;
use serde::Serialize;
// Structures for GitHub data
#[derive(Debug, Deserialize, Clone)]
pub struct GithubRelease {
pub name: String,
pub tag_name: String,
pub html_url: String,
pub published_at: Option<String>,
pub body: Option<String>,
}
#[derive(Debug, Clone)]
pub struct GithubReleaseInfo {
pub repo: String,
#[allow(dead_code)]
pub name: String,
pub tag_name: String,
pub html_url: String,
pub changelog: String,
pub published_at: String,
}
// Structures for Docker data
#[derive(Debug, Deserialize)]
pub struct DockerTag {
pub digest: String,
pub last_updated: String,
}
#[derive(Debug, Clone)]
pub struct DockerReleaseInfo {
pub repo: String,
pub digest: String,
pub html_url: String,
pub published_at: String,
}
#[allow(dead_code)]
pub struct NotifiedRelease {
pub repo: String,
pub tag_name: String,
pub notified_at: chrono::DateTime<chrono::Utc>,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct User {
pub id: i64,
pub username: String,
pub password_hash: String,
pub is_admin: bool,
pub created_at: chrono::DateTime<chrono::Utc>,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct UserLogin {
pub username: String,
pub password: String,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct UserRegistration {
pub username: String,
pub password: String,
pub is_admin: bool,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Session {
pub token: String,
pub user_id: i64,
pub expires_at: chrono::DateTime<chrono::Utc>,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct AppSettings {
pub id: Option<i64>,
pub ntfy_url: Option<String>,
pub github_token: Option<String>,
pub docker_username: Option<String>,
pub docker_password: Option<String>,
pub gotify_url: Option<String>,
pub gotify_token: Option<String>,
pub discord_webhook_url: Option<String>,
pub slack_webhook_url: Option<String>,
pub check_interval: Option<i64>,
pub auth: Option<String>,
pub last_updated: chrono::DateTime<chrono::Utc>,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct AuthResponse {
pub token: String,
pub user: User,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct ApiResponse<T> {
pub success: bool,
pub message: String,
pub data: Option<T>,
}

View File

@@ -0,0 +1,85 @@
use log::{error, info};
use serde_json::json;
use reqwest::header::HeaderMap;
use crate::models::{GithubReleaseInfo, DockerReleaseInfo};
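// Posts a Discord webhook message for a new GitHub release, linking to the release instead of embedding the changelog when the message exceeds Discord's 2000-character limit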
pub async fn send_github_notification(release: &GithubReleaseInfo, webhook_url: &str) {
let client = reqwest::Client::new();
let app_name = release.repo.split('/').last().unwrap_or(&release.repo);
let mut message = format!(
"📌 *New version*: {}\n\n📦*For*: {}\n\n📅 *Published on*: {}\n\n📝 *Changelog*:\n\n```{}```",
release.tag_name,
app_name,
release.published_at.replace("T", " ").replace("Z", ""),
release.changelog
);
if message.len() > 2000 {
message = format!(
"📌 *New version*: {}\n\n📦*For*: {}\n\n📅 *Published on*: {}\n\n🔗 *Release Link*: {}",
release.tag_name,
app_name,
release.published_at.replace("T", " ").replace("Z", ""),
release.html_url
);
}
let data = json!({
"content": message,
"username": "GitHub Ntfy"
});
let headers = HeaderMap::new();
match client.post(webhook_url)
.headers(headers)
.json(&data)
.send()
.await
{
Ok(response) if response.status().is_success() => {
info!("Message sent to Discord for {}", app_name);
},
Ok(response) => {
error!("Failed to send message to Discord. Status code: {}", response.status());
},
Err(e) => {
error!("Error sending to Discord: {}", e);
}
}
}
pub async fn send_docker_notification(release: &DockerReleaseInfo, webhook_url: &str) {
let client = reqwest::Client::new();
let app_name = release.repo.split('/').last().unwrap_or(&release.repo);
let message = format!(
"🐳 *Docker Image Updated!*\n\n🔐 *New Digest*: `{}`\n\n📦 *App*: {}\n\n📢 *Published*: {}\n\n🔗 *Link*: {}",
release.digest,
app_name,
release.published_at.replace("T", " ").replace("Z", ""),
release.html_url
);
let data = json!({
"content": message,
"username": "GitHub Ntfy"
});
match client.post(webhook_url)
.json(&data)
.send()
.await
{
Ok(response) if response.status().is_success() => {
info!("Message sent to Discord for {}", app_name);
},
Ok(response) => {
error!("Failed to send message to Discord. Status code: {}", response.status());
},
Err(e) => {
error!("Error sending to Discord: {}", e);
}
}
}

View File

@@ -0,0 +1,70 @@
use tokio::task;
use crate::models::DockerReleaseInfo;
use crate::config::Config;
use crate::notifications::{ntfy, gotify, discord, slack};
pub async fn send_to_ntfy(release: DockerReleaseInfo, auth: &str, ntfy_url: &str) {
ntfy::send_docker_notification(&release, auth, ntfy_url).await;
}
pub async fn send_to_gotify(release: DockerReleaseInfo, token: &str, gotify_url: &str) {
gotify::send_docker_notification(&release, token, gotify_url).await;
}
pub async fn send_to_discord(release: DockerReleaseInfo, webhook_url: &str) {
discord::send_docker_notification(&release, webhook_url).await;
}
pub async fn send_to_slack(release: DockerReleaseInfo, webhook_url: &str) {
slack::send_docker_notification(&release, webhook_url).await;
}
#[allow(dead_code)]
pub async fn send_notifications(releases: &[DockerReleaseInfo], config: &Config) {
let mut tasks = Vec::new();
for release in releases {
// Send to Ntfy
if let Some(url) = &config.ntfy_url {
let release_clone = release.clone();
let auth = config.auth.clone();
let url_clone = url.clone();
tasks.push(task::spawn(async move {
send_to_ntfy(release_clone, &auth, &url_clone).await;
}));
}
// Send to Gotify
if let (Some(gotify_url), Some(gotify_token)) = (&config.gotify_url, &config.gotify_token) {
let release_clone = release.clone();
let token = gotify_token.clone();
let url = gotify_url.clone();
tasks.push(task::spawn(async move {
send_to_gotify(release_clone, &token, &url).await;
}));
}
// Send to Discord
if let Some(discord_url) = &config.discord_webhook_url {
let release_clone = release.clone();
let url = discord_url.clone();
tasks.push(task::spawn(async move {
send_to_discord(release_clone, &url).await;
}));
}
// Send to Slack
if let Some(slack_url) = &config.slack_webhook_url {
let release_clone = release.clone();
let url = slack_url.clone();
tasks.push(task::spawn(async move {
send_to_slack(release_clone, &url).await;
}));
}
}
// Wait for all tasks to complete
for task in tasks {
let _ = task.await;
}
}

View File

@@ -0,0 +1,70 @@
use tokio::task;
use crate::models::GithubReleaseInfo;
use crate::config::Config;
use crate::notifications::{ntfy, gotify, discord, slack};
pub async fn send_to_ntfy(release: GithubReleaseInfo, auth: &str, ntfy_url: &str) {
ntfy::send_github_notification(&release, auth, ntfy_url).await;
}
pub async fn send_to_gotify(release: GithubReleaseInfo, token: &str, gotify_url: &str) {
gotify::send_github_notification(&release, token, gotify_url).await;
}
pub async fn send_to_discord(release: GithubReleaseInfo, webhook_url: &str) {
discord::send_github_notification(&release, webhook_url).await;
}
pub async fn send_to_slack(release: GithubReleaseInfo, webhook_url: &str) {
slack::send_github_notification(&release, webhook_url).await;
}
#[allow(dead_code)]
pub async fn send_notifications(releases: &[GithubReleaseInfo], config: &Config) {
let mut tasks = Vec::new();
for release in releases {
// Send to Ntfy
if let Some(url) = &config.ntfy_url {
let release_clone = release.clone();
let auth = config.auth.clone();
let url_clone = url.clone();
tasks.push(task::spawn(async move {
send_to_ntfy(release_clone, &auth, &url_clone).await;
}));
}
// Send to Gotify
if let (Some(gotify_url), Some(gotify_token)) = (&config.gotify_url, &config.gotify_token) {
let release_clone = release.clone();
let token = gotify_token.clone();
let url = gotify_url.clone();
tasks.push(task::spawn(async move {
send_to_gotify(release_clone, &token, &url).await;
}));
}
// Send to Discord
if let Some(discord_url) = &config.discord_webhook_url {
let release_clone = release.clone();
let url = discord_url.clone();
tasks.push(task::spawn(async move {
send_to_discord(release_clone, &url).await;
}));
}
// Send to Slack
if let Some(slack_url) = &config.slack_webhook_url {
let release_clone = release.clone();
let url = slack_url.clone();
tasks.push(task::spawn(async move {
send_to_slack(release_clone, &url).await;
}));
}
}
// Wait for all tasks to complete
for task in tasks {
let _ = task.await;
}
}

View File

@@ -0,0 +1,78 @@
use log::{error, info};
use serde_json::json;
use crate::models::{GithubReleaseInfo, DockerReleaseInfo};
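// Sends a GitHub release notification through the Gotify message endpoint using the application token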
pub async fn send_github_notification(release: &GithubReleaseInfo, token: &str, gotify_url: &str) {
let client = reqwest::Client::new();
let app_name = release.repo.split('/').last().unwrap_or(&release.repo);
let url = format!("{}/message?token={}", gotify_url, token);
let message = format!(
"📌 *New version*: {}\n\n📦*For*: {}\n\n📅 *Published on*: {}\n\n📝 *Changelog*:\n\n```{}```\n\n🔗 *Release Url*:{}",
release.tag_name,
app_name,
release.published_at.replace("T", " ").replace("Z", ""),
release.changelog,
release.html_url
);
let content = json!({
"title": format!("New version for {}", app_name),
"message": message,
"priority": "2"
});
match client.post(&url)
.json(&content)
.send()
.await
{
Ok(response) if response.status().is_success() => {
info!("Message sent to Gotify for {}", app_name);
},
Ok(response) => {
error!("Failed to send message to Gotify. Status code: {}", response.status());
},
Err(e) => {
error!("Error sending to Gotify: {}", e);
}
}
}
pub async fn send_docker_notification(release: &DockerReleaseInfo, token: &str, gotify_url: &str) {
let client = reqwest::Client::new();
let app_name = release.repo.split('/').last().unwrap_or(&release.repo);
let url = format!("{}/message?token={}", gotify_url, token);
let message = format!(
"🐳 *Docker Image Updated!*\n\n🔐 *New Digest*: `{}`\n\n📦 *App*: {}\n\n📢 *Published*: {}\n\n🔗 *Release Url*:{}",
release.digest,
app_name,
release.published_at.replace("T", " ").replace("Z", ""),
release.html_url
);
let content = json!({
"title": format!("New version for {}", app_name),
"message": message,
"priority": "2"
});
match client.post(&url)
.json(&content)
.send()
.await
{
Ok(response) if response.status().is_success() => {
info!("Message sent to Gotify for {}", app_name);
},
Ok(response) => {
error!("Failed to send message to Gotify. Status code: {}", response.status());
},
Err(e) => {
error!("Error sending to Gotify: {}", e);
}
}
}

109
src/notifications/mod.rs Normal file
View File

@@ -0,0 +1,109 @@
pub mod ntfy;
pub mod gotify;
pub mod discord;
pub mod slack;
pub mod github;
pub mod docker;
use tokio::task;
use crate::models::{GithubReleaseInfo, DockerReleaseInfo};
use crate::config::Config;
use crate::database::{Connection, is_new_version, update_version};
use rusqlite::Result as SqliteResult;
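// Dispatches GitHub and Docker release notifications to every configured service and records the new versions in the database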
pub async fn send_notifications(
github_releases: Vec<GithubReleaseInfo>,
docker_releases: Vec<DockerReleaseInfo>,
config: &Config,
db_conn: &Connection,
) -> SqliteResult<()> {
let mut tasks = Vec::new();
// Create tasks for GitHub notifications
for release in &github_releases {
if is_new_version(db_conn, &release.repo, &release.tag_name)? {
if let Some(url) = &config.ntfy_url {
let release = release.clone();
let auth = config.auth.clone();
let url = url.clone();
tasks.push(task::spawn(async move {
github::send_to_ntfy(release, &auth, &url).await;
}));
}
if let (Some(gotify_url), Some(gotify_token)) = (&config.gotify_url, &config.gotify_token) {
let release = release.clone();
let url = gotify_url.clone();
let token = gotify_token.clone();
tasks.push(task::spawn(async move {
github::send_to_gotify(release, &token, &url).await;
}));
}
if let Some(discord_url) = &config.discord_webhook_url {
let release = release.clone();
let url = discord_url.clone();
tasks.push(task::spawn(async move {
github::send_to_discord(release, &url).await;
}));
}
if let Some(slack_url) = &config.slack_webhook_url {
let release = release.clone();
let url = slack_url.clone();
tasks.push(task::spawn(async move {
github::send_to_slack(release, &url).await;
}));
}
update_version(db_conn, &release.repo, &release.tag_name, Some(release.changelog.as_str()))?;
}
}
for release in &docker_releases {
if is_new_version(db_conn, &release.repo, &release.digest)? {
if let Some(url) = &config.ntfy_url {
let release = release.clone();
let auth = config.auth.clone();
let url = url.clone();
tasks.push(task::spawn(async move {
docker::send_to_ntfy(release, &auth, &url).await;
}));
}
if let (Some(gotify_url), Some(gotify_token)) = (&config.gotify_url, &config.gotify_token) {
let release = release.clone();
let url = gotify_url.clone();
let token = gotify_token.clone();
tasks.push(task::spawn(async move {
docker::send_to_gotify(release, &token, &url).await;
}));
}
if let Some(discord_url) = &config.discord_webhook_url {
let release = release.clone();
let url = discord_url.clone();
tasks.push(task::spawn(async move {
docker::send_to_discord(release, &url).await;
}));
}
if let Some(slack_url) = &config.slack_webhook_url {
let release = release.clone();
let url = slack_url.clone();
tasks.push(task::spawn(async move {
docker::send_to_slack(release, &url).await;
}));
}
update_version(db_conn, &release.repo, &release.digest, None)?;
}
}
// Wait for all tasks to complete
for task in tasks {
let _ = task.await;
}
Ok(())
}

84
src/notifications/ntfy.rs Normal file
View File

@@ -0,0 +1,84 @@
use log::{error, info};
use reqwest::header::{HeaderMap, HeaderValue};
use crate::models::{GithubReleaseInfo, DockerReleaseInfo};
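// Sends a Markdown-formatted GitHub release notification to ntfy with Basic auth and action headers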
pub async fn send_github_notification(release: &GithubReleaseInfo, auth: &str, ntfy_url: &str) {
let client = reqwest::Client::new();
let app_name = release.repo.split('/').last().unwrap_or(&release.repo);
let mut headers = HeaderMap::new();
headers.insert("Authorization", HeaderValue::from_str(&format!("Basic {}", auth))
.unwrap_or_else(|_| HeaderValue::from_static("")));
headers.insert("Title", HeaderValue::from_str(&format!("New version for {}", app_name))
.unwrap_or_else(|_| HeaderValue::from_static("")));
headers.insert("Priority", HeaderValue::from_static("urgent"));
headers.insert("Markdown", HeaderValue::from_static("yes"));
headers.insert("Actions", HeaderValue::from_str(&format!("view, Update {}, {}, clear=true", app_name, release.html_url))
.unwrap_or_else(|_| HeaderValue::from_static("")));
let message = format!(
"📌 *New version*: {}\n\n📦*For*: {}\n\n📅 *Published on*: {}\n\n📝 *Changelog*:\n\n```{}```\n\n 🔗 *Release Url*: {}",
release.tag_name,
app_name,
release.published_at.replace("T", " ").replace("Z", ""),
release.changelog,
release.html_url
);
match client.post(ntfy_url)
.headers(headers)
.body(message)
.send()
.await
{
Ok(response) if response.status().is_success() => {
info!("Message sent to Ntfy for {}", app_name);
},
Ok(response) => {
error!("Failed to send message to Ntfy. Status code: {}", response.status());
},
Err(e) => {
error!("Error sending to Ntfy: {}", e);
}
}
}
pub async fn send_docker_notification(release: &DockerReleaseInfo, auth: &str, ntfy_url: &str) {
let client = reqwest::Client::new();
let app_name = release.repo.split('/').last().unwrap_or(&release.repo);
let mut headers = HeaderMap::new();
headers.insert("Authorization", HeaderValue::from_str(&format!("Basic {}", auth))
.unwrap_or_else(|_| HeaderValue::from_static("")));
headers.insert("Title", HeaderValue::from_str(&format!("🆕 New version for {}", app_name))
.unwrap_or_else(|_| HeaderValue::from_static("")));
headers.insert("Priority", HeaderValue::from_static("urgent"));
headers.insert("Markdown", HeaderValue::from_static("yes"));
headers.insert("Actions", HeaderValue::from_str(&format!("View, Update {}, {}, clear=true", app_name, release.html_url))
.unwrap_or_else(|_| HeaderValue::from_static("")));
let message = format!(
"🐳 *Docker Image Updated!*\n\n🔐 *New Digest*: `{}`\n\n📦 *App*: {}\n\n📢 *Published*: {}\n\n 🔗 *Release Url*: {}",
release.digest,
app_name,
release.published_at.replace("T", " ").replace("Z", ""),
release.html_url
);
match client.post(ntfy_url)
.headers(headers)
.body(message)
.send()
.await
{
Ok(response) if response.status().is_success() => {
info!("Message sent to Ntfy for {}", app_name);
},
Ok(response) => {
error!("Failed to send message to Ntfy. Status code: {}", response.status());
},
Err(e) => {
error!("Error sending to Ntfy: {}", e);
}
}
}

131
src/notifications/slack.rs Normal file
View File

@@ -0,0 +1,131 @@
use log::{error, info};
use serde_json::json;
use reqwest::header::{HeaderMap, HeaderValue, CONTENT_TYPE};
use std::iter::FromIterator;
use crate::models::{GithubReleaseInfo, DockerReleaseInfo};
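// Posts a Slack Block Kit message for a new GitHub release, replacing the changelog with a truncation notice when the text exceeds 2000 characters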
pub async fn send_github_notification(release: &GithubReleaseInfo, webhook_url: &str) {
let client = reqwest::Client::new();
let app_name = release.repo.split('/').last().unwrap_or(&release.repo);
let mut message = format!(
"📌 *New version*: {}\n\n📦*For*: {}\n\n📅 *Published on*: {}\n\n📝 *Changelog*:\n\n```{}```",
release.tag_name,
app_name,
release.published_at.replace("T", " ").replace("Z", ""),
release.changelog
);
if message.len() > 2000 {
message = format!(
"📌 *New version*: {}\n\n📦*For*: {}\n\n📅 *Published on*: {}\n\n📝 *Changelog*:\n\n `truncated..` use 🔗 instead",
release.tag_name,
app_name,
release.published_at.replace("T", " ").replace("Z", "")
);
}
let data = json!({
"blocks": [
{
"type": "section",
"text": {
"type": "mrkdwn",
"text": message
},
"accessory": {
"type": "button",
"text": {
"type": "plain_text",
"text": "View Release"
},
"url": release.html_url,
"action_id": "button-action"
}
},
{
"type": "divider"
}
]
});
let headers = HeaderMap::from_iter([(
CONTENT_TYPE,
HeaderValue::from_static("application/json")
)]);
match client.post(webhook_url)
.headers(headers)
.json(&data)
.send()
.await
{
Ok(response) if response.status().is_success() => {
info!("Message sent to Slack for {}", app_name);
},
Ok(response) => {
error!("Failed to send message to Slack. Status code: {}", response.status());
},
Err(e) => {
error!("Error sending to Slack: {}", e);
}
}
}
pub async fn send_docker_notification(release: &DockerReleaseInfo, webhook_url: &str) {
let client = reqwest::Client::new();
let app_name = release.repo.split('/').last().unwrap_or(&release.repo);
let message = format!(
"🐳 *Docker Image Updated!*\n\n🔐 *New Digest*: `{}`\n\n📦 *App*: {}\n\n📢*Published*: {}",
release.digest,
app_name,
release.published_at.replace("T", " ").replace("Z", "")
);
let data = json!({
"blocks": [
{
"type": "section",
"text": {
"type": "mrkdwn",
"text": message
},
"accessory": {
"type": "button",
"text": {
"type": "plain_text",
"text": "View Image"
},
"url": release.html_url,
"action_id": "button-action"
}
},
{
"type": "divider"
}
]
});
let headers = HeaderMap::from_iter([(
CONTENT_TYPE,
HeaderValue::from_static("application/json")
)]);
match client.post(webhook_url)
.headers(headers)
.json(&data)
.send()
.await
{
Ok(response) if response.status().is_success() => {
info!("Message sent to Slack for {}", app_name);
},
Ok(response) => {
error!("Failed to send message to Slack. Status code: {}", response.status());
},
Err(e) => {
error!("Error sending to Slack: {}", e);
}
}
}

24
web/.gitignore vendored Normal file
View File

@@ -0,0 +1,24 @@
# Nuxt dev/build outputs
.output
.data
.nuxt
.nitro
.cache
dist
# Node dependencies
node_modules
# Logs
logs
*.log
# Misc
.DS_Store
.fleet
.idea
# Local env files
.env
.env.*
!.env.example

75
web/README.md Normal file
View File

@@ -0,0 +1,75 @@
# Nuxt Minimal Starter
Look at the [Nuxt documentation](https://nuxt.com/docs/getting-started/introduction) to learn more.
## Setup
Make sure to install dependencies:
```bash
# npm
npm install
# pnpm
pnpm install
# yarn
yarn install
# bun
bun install
```
## Development Server
Start the development server on `http://localhost:3000`:
```bash
# npm
npm run dev
# pnpm
pnpm dev
# yarn
yarn dev
# bun
bun run dev
```
## Production
Build the application for production:
```bash
# npm
npm run build
# pnpm
pnpm build
# yarn
yarn build
# bun
bun run build
```
Locally preview production build:
```bash
# npm
npm run preview
# pnpm
pnpm preview
# yarn
yarn preview
# bun
bun run preview
```
Check out the [deployment documentation](https://nuxt.com/docs/getting-started/deployment) for more information.

19
web/app.vue Normal file
View File

@@ -0,0 +1,19 @@
<template>
<div class="min-h-screen bg-gray-900 text-gray-200">
<UContainer>
<AppHeader />
<main class="py-8">
<NuxtPage />
</main>
<AppFooter />
</UContainer>
</div>
</template>
<script setup>
// No script content provided in the original code
</script>
<style>
/* No style content provided in the original code */
</style>

2
web/assets/css/main.css Normal file
View File

@@ -0,0 +1,2 @@
@import 'tailwindcss';
@import '@nuxt/ui';

View File

@@ -0,0 +1,6 @@
<template>
<footer class="text-center py-6 bg-emerald-950 rounded-t-lg mt-4">
<p class="text-sm">I know this web interface is simple, but I'm improving!</p>
</footer>
</template>

View File

@@ -0,0 +1,41 @@
<template>
<header class="py-6 bg-emerald-950 shadow-lg rounded-b-lg mb-4">
<div class="container mx-auto px-4 flex justify-between items-center">
<NuxtLink to="/" class="text-white hover:text-gray-200 transition-colors duration-200">
<h1 class="text-4xl font-bold tracking-wide">Github Ntfy</h1>
</NuxtLink>
<div v-if="auth.isAuthenticated" class="flex space-x-3">
<UButton
to="/settings"
variant="ghost"
color="white"
icon="i-heroicons-cog-6-tooth"
size="sm"
>
Settings
</UButton>
<UButton
@click="handleLogout"
variant="ghost"
color="white"
icon="i-heroicons-arrow-right-on-rectangle"
size="sm"
>
Logout
</UButton>
</div>
</div>
</header>
</template>
<script setup>
const auth = useAuth();
const router = useRouter();
const handleLogout = async () => {
await auth.logout();
router.push('/login');
};
</script>

View File

@@ -0,0 +1,106 @@
<template>
<UCard class="bg-emerald-950 shadow-lg">
<template #header>
<h2 class="text-2xl font-semibold">Add a Docker Repo</h2>
</template>
<form @submit.prevent="addDockerRepo">
<UFormGroup label="Name of the Docker Repo" name="dockerRepo">
<div class="flex items-center">
<UBadge class="mr-2 py-2.5 px-3 bg-gray-700 text-gray-400">hub.docker.com/r/</UBadge>
<UInput
v-model="dockerRepoName"
placeholder="breizhhardware/github-ntfy"
class="flex-1 bg-gray-700"
/>
</div>
</UFormGroup>
<div class="flex justify-end gap-4 mt-4">
<UButton color="gray" variant="ghost" @click="dockerRepoName = ''">Cancel</UButton>
<UButton type="submit" color="green" variant="solid">Save</UButton>
</div>
</form>
<template #footer>
<div class="mt-4">
<h3 class="text-lg font-semibold mb-2">Watched Docker Repositories</h3>
<UList v-if="watchedDockerRepos.length" class="space-y-2">
<UListItem v-for="repo in watchedDockerRepos" :key="repo" class="flex justify-between items-center">
<span>{{ repo }}</span>
<UButton
color="red"
variant="ghost"
icon="i-heroicons-x-mark"
size="xs"
@click="removeDockerRepo(repo)"
/>
</UListItem>
</UList>
<p v-else class="text-gray-400 italic">No Docker repositories being watched</p>
</div>
</template>
</UCard>
</template>
<script setup>
const dockerRepoName = ref('')
const watchedDockerRepos = ref([])
onMounted(() => {
refreshWatchedDockerRepos()
})
async function addDockerRepo() {
if (!dockerRepoName.value) return
try {
const response = await fetch('/app_docker_repo', {
method: 'POST',
headers: {
'Content-Type': 'application/json'
},
body: JSON.stringify({ repo: dockerRepoName.value })
})
if (response.ok) {
dockerRepoName.value = ''
await refreshWatchedDockerRepos()
} else {
throw new Error('Failed to add Docker repository')
}
} catch (error) {
console.error('Error:', error)
}
}
async function refreshWatchedDockerRepos() {
try {
const response = await fetch('/watched_docker_repos')
if (response.ok) {
watchedDockerRepos.value = await response.json()
}
} catch (error) {
console.error('Error fetching watched Docker repos:', error)
}
}
async function removeDockerRepo(repo) {
try {
const response = await fetch('/delete_docker_repo', {
method: 'POST',
headers: {
'Content-Type': 'application/json'
},
body: JSON.stringify({ repo })
})
if (response.ok) {
await refreshWatchedDockerRepos()
} else {
throw new Error('Failed to remove Docker repository')
}
} catch (error) {
console.error('Error:', error)
}
}
</script>

View File

@@ -0,0 +1,106 @@
<template>
<UCard class="bg-emerald-950 shadow-lg">
<template #header>
<h2 class="text-2xl font-semibold">Add a Github Repo</h2>
</template>
<form @submit.prevent="addRepo">
<UFormGroup label="Name of the Github Repo" name="repo">
<div class="flex items-center">
<UBadge class="mr-2 py-2.5 px-3 bg-gray-700 text-gray-400">github.com/</UBadge>
<UInput
v-model="repoName"
placeholder="BreizhHardware/ntfy_alerts"
class="flex-1 bg-gray-700"
/>
</div>
</UFormGroup>
<div class="flex justify-end gap-4 mt-4">
<UButton color="gray" variant="ghost" @click="repoName = ''">Cancel</UButton>
<UButton type="submit" color="green" variant="solid">Save</UButton>
</div>
</form>
<template #footer>
<div class="mt-4">
<h3 class="text-lg font-semibold mb-2">Watched Github Repositories</h3>
<UList v-if="watchedRepos.length" class="space-y-2">
<UListItem v-for="repo in watchedRepos" :key="repo" class="flex justify-between items-center">
<span>{{ repo }}</span>
<UButton
color="red"
variant="ghost"
icon="i-heroicons-x-mark"
size="xs"
@click="removeRepo(repo)"
/>
</UListItem>
</UList>
<p v-else class="text-gray-400 italic">No repositories being watched</p>
</div>
</template>
</UCard>
</template>
<script setup>
const repoName = ref('')
const watchedRepos = ref([])
onMounted(() => {
refreshWatchedRepos()
})
async function addRepo() {
if (!repoName.value) return
try {
const response = await fetch('/app_github_repo', {
method: 'POST',
headers: {
'Content-Type': 'application/json'
},
body: JSON.stringify({ repo: repoName.value })
})
if (response.ok) {
repoName.value = ''
await refreshWatchedRepos()
} else {
throw new Error('Failed to add repository')
}
} catch (error) {
console.error('Error:', error)
}
}
async function refreshWatchedRepos() {
try {
const response = await fetch('/watched_repos')
if (response.ok) {
watchedRepos.value = await response.json()
}
} catch (error) {
console.error('Error fetching watched repos:', error)
}
}
async function removeRepo(repo) {
try {
const response = await fetch('/delete_repo', {
method: 'POST',
headers: {
'Content-Type': 'application/json'
},
body: JSON.stringify({ repo })
})
if (response.ok) {
await refreshWatchedRepos()
} else {
throw new Error('Failed to remove repository')
}
} catch (error) {
console.error('Error:', error)
}
}
</script>

View File

@@ -0,0 +1,92 @@
<template>
<UCard class="bg-gray-800 shadow-lg mb-8">
<template #header>
<h2 class="text-2xl font-semibold">Latest Updates</h2>
</template>
<div class="space-y-4">
<div v-for="(update, index) in latestUpdates" :key="index" class="border border-gray-700 rounded-md overflow-hidden">
<button
@click="toggleChangelog(index)"
class="w-full flex justify-between items-center px-4 py-3 bg-gray-700 hover:bg-gray-600 transition-colors text-left"
>
<div>
<span class="font-medium">{{ update.repo }} - {{ update.version }}</span>
<div class="text-sm text-gray-400">{{ update.date }}</div>
</div>
<UIcon :name="openStates[index] ? 'i-heroicons-chevron-up' : 'i-heroicons-chevron-down'" class="text-gray-400" />
</button>
<div
v-show="openStates[index]"
class="p-4 bg-gray-800 prose prose-invert max-w-none transition-all"
v-html="renderedChangelogs[index]"
></div>
</div>
</div>
</UCard>
</template>
<script setup>
import { marked } from 'marked';
const latestUpdates = ref([]);
const renderedChangelogs = ref([]);
const openStates = ref([]);
onMounted(async () => {
try {
const response = await fetch('/latest_updates');
if (response.ok) {
latestUpdates.value = await response.json();
renderedChangelogs.value = latestUpdates.value.map(update =>
marked(update.changelog)
);
openStates.value = Array(latestUpdates.value.length).fill(false);
} else {
console.error('Error fetching latest updates');
}
} catch (error) {
console.error('Error:', error);
}
});
function toggleChangelog(index) {
openStates.value[index] = !openStates.value[index];
}
</script>
<style>
.prose h1, .prose h2, .prose h3 {
margin-top: 1em;
margin-bottom: 0.5em;
font-weight: 600;
}
.prose ul {
list-style-type: disc;
padding-left: 1.5em;
margin: 0.5em 0;
}
.prose p {
margin: 0.5em 0;
}
.prose a {
color: #60a5fa;
text-decoration: underline;
}
.prose code {
background-color: rgba(0, 0, 0, 0.1);
padding: 0.1em 0.3em;
border-radius: 0.2em;
}
.prose blockquote {
border-left: 4px solid #4b5563;
padding-left: 1em;
font-style: italic;
margin: 0.5em 0;
}
</style>

154
web/composables/useAuth.js Normal file
View File

@@ -0,0 +1,154 @@
// Composable for managing authentication
export const useAuth = () => {
const user = useState('user', () => null);
const token = useState('token', () => null);
const isFirstLogin = useState('isFirstLogin', () => false);
// Initialize authentication state from localStorage
onMounted(() => {
const storedToken = localStorage.getItem('token');
const storedUser = localStorage.getItem('user');
if (storedToken && storedUser) {
token.value = storedToken;
user.value = JSON.parse(storedUser);
}
});
// Login function
const login = async (username, password) => {
try {
const response = await fetch('/auth/login', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({ username, password }),
});
if (!response.ok) {
const error = await response.json();
throw new Error(error.message || 'Login failed');
}
const data = await response.json();
if (!data.success || !data.data) {
throw new Error(data.message || 'Login failed');
}
// Store authentication information
token.value = data.data.token;
user.value = data.data.user;
localStorage.setItem('token', data.data.token);
localStorage.setItem('user', JSON.stringify(data.data.user));
// Check if this is the first login
const configResponse = await fetch('/is_configured');
if (configResponse.ok) {
const configData = await configResponse.json();
isFirstLogin.value = !configData.data.settings_exist;
}
return data;
} catch (error) {
console.error('Login error:', error);
throw error;
}
};
// Registration function
const register = async (username, password, isAdmin = false, isPending = false) => {
try {
const response = await fetch('/auth/register', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
username,
password,
is_admin: isAdmin,
is_pending: isPending
}),
});
if (!response.ok) {
const error = await response.json();
throw new Error(error.message || 'Registration failed');
}
const data = await response.json();
if (!data.success || !data.data) {
throw new Error(data.message || 'Registration failed');
}
// If registration is pending, don't store auth info
if (isPending) {
return data;
}
// Store authentication information
token.value = data.data.token;
user.value = data.data.user;
localStorage.setItem('token', data.data.token);
localStorage.setItem('user', JSON.stringify(data.data.user));
// By default, consider a new registration needs onboarding
isFirstLogin.value = true;
return data;
} catch (error) {
console.error('Registration error:', error);
throw error;
}
};
// Logout function
const logout = async () => {
try {
if (token.value) {
await fetch('/auth/logout', {
method: 'POST',
headers: {
'Authorization': token.value,
},
});
}
} catch (error) {
console.error('Logout error:', error);
} finally {
// Clean up local authentication data
token.value = null;
user.value = null;
localStorage.removeItem('token');
localStorage.removeItem('user');
}
};
// Check if user is authenticated
const isAuthenticated = computed(() => !!token.value && !!user.value);
// Check if user is admin
const isAdmin = computed(() => isAuthenticated.value && user.value?.is_admin);
// Get token for authenticated requests
const getAuthHeader = () => {
return token.value ? { Authorization: token.value } : {};
};
return {
user,
token,
isFirstLogin,
login,
register,
logout,
isAuthenticated,
isAdmin,
getAuthHeader,
};
};

22
web/nuxt.config.ts Normal file
View File

@@ -0,0 +1,22 @@
// https://nuxt.com/docs/api/configuration/nuxt-config
export default defineNuxtConfig({
compatibilityDate: '2025-05-15',
devtools: { enabled: true },
modules: [
'@nuxt/ui'
],
ui: {
global: true,
icons: ['heroicons']
},
css: ['~/assets/css/main.css'],
postcss: {
plugins: {
'@tailwindcss/postcss': {},
autoprefixer: {},
},
},
plugins: [
'~/plugins/auth.js'
]
})

27
web/package.json Normal file
View File

@@ -0,0 +1,27 @@
{
"name": "nuxt-app",
"private": true,
"type": "module",
"scripts": {
"build": "nuxt build",
"dev": "nuxt dev",
"generate": "nuxt generate",
"preview": "nuxt preview",
"postinstall": "nuxt prepare"
},
"dependencies": {
"@nuxt/icon": "1.14.0",
"@nuxt/ui": "3.1.3",
"marked": "^15.0.12",
"nuxt": "^3.17.5",
"typescript": "^5.8.3",
"vue": "^3.5.16",
"vue-router": "^4.5.1"
},
"devDependencies": {
"@nuxtjs/tailwindcss": "7.0.0-beta.0",
"@tailwindcss/postcss": "^4.1.10",
"postcss": "^8.5.6",
"tailwindcss": "^4.1.10"
}
}

12
web/pages/index.vue Normal file
View File

@@ -0,0 +1,12 @@
<template>
<div>
<!-- Latest updates section -->
<LatestUpdates />
<!-- GitHub and Docker repositories section -->
<div class="grid grid-cols-1 md:grid-cols-2 gap-8">
<GithubRepoSection />
<DockerRepoSection />
</div>
</div>
</template>

91
web/pages/login.vue Normal file
View File

@@ -0,0 +1,91 @@
<template>
<div class="flex items-center justify-center min-h-screen bg-gray-900">
<div class="w-full max-w-md p-8 space-y-8 bg-gray-800 rounded-lg shadow-lg">
<div class="text-center">
<h1 class="text-2xl font-bold text-white">Login</h1>
<p class="mt-2 text-sm text-gray-400">Sign in to manage your notifications</p>
</div>
<form @submit.prevent="handleLogin" class="mt-8 space-y-6">
<div>
<label for="username" class="block text-sm font-medium text-gray-400">Username</label>
<input
id="username"
v-model="form.username"
type="text"
required
class="block w-full px-3 py-2 mt-1 text-white placeholder-gray-500 bg-gray-700 border border-gray-600 rounded-md shadow-sm focus:outline-none focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm"
/>
</div>
<div>
<label for="password" class="block text-sm font-medium text-gray-400">Password</label>
<input
id="password"
v-model="form.password"
type="password"
required
class="block w-full px-3 py-2 mt-1 text-white placeholder-gray-500 bg-gray-700 border border-gray-600 rounded-md shadow-sm focus:outline-none focus:ring-indigo-500 focus:border-indigo-500 sm:text-sm"
/>
</div>
<div v-if="error" class="p-3 text-sm text-red-500 bg-red-100 rounded-md">
{{ error }}
</div>
<div>
<UButton
type="submit"
color="primary"
block
:loading="loading"
>
Login
</UButton>
</div>
</form>
<div class="text-center mt-4">
<p class="text-sm text-gray-400">
First time?
<NuxtLink to="/onboarding" class="font-medium text-indigo-400 hover:text-indigo-300">
Setup your application
</NuxtLink>
</p>
</div>
</div>
</div>
</template>
<script setup>
const auth = useAuth();
const router = useRouter();
const form = reactive({
username: '',
password: ''
});
const error = ref('');
const loading = ref(false);
async function handleLogin() {
try {
loading.value = true;
error.value = '';
await auth.login(form.username, form.password);
// Redirect to main page or configuration page if needed
if (auth.isFirstLogin.value) {
router.push('/onboarding');
} else {
router.push('/');
}
} catch (err) {
error.value = err.message || 'An error occurred during login';
} finally {
loading.value = false;
}
}
</script>

489
web/pages/onboarding.vue Normal file
View File

@@ -0,0 +1,489 @@
<template>
<div class="min-h-screen bg-gray-900 p-6">
<div class="max-w-3xl mx-auto bg-gray-800 rounded-lg shadow-lg overflow-hidden">
<div class="p-6 border-b border-gray-700">
<h1 class="text-2xl font-bold text-white">Application Setup</h1>
<p class="mt-2 text-gray-400">Configure your application and create an administrator account</p>
</div>
<UStepper v-model="step" :items="steps" class="p-6">
<template #item="{ item }">
<h2 class="text-lg font-medium">{{ item.title }}</h2>
<p class="text-sm text-gray-400">{{ item.description }}</p>
</template>
</UStepper>
<div class="p-6">
<!-- Step 1: Create Administrator Account -->
<div v-if="step === 0" class="space-y-6">
<div>
<h3 class="text-lg font-medium text-white mb-4">Create Administrator Account</h3>
<p class="text-sm text-gray-400 mb-6">This account will have full access to manage the application</p>
<div class="space-y-4">
<div>
<label for="username" class="block text-sm font-medium text-gray-400">Username</label>
<UInput
id="username"
v-model="adminUser.username"
placeholder="admin"
class="w-full"
/>
</div>
<div>
<label for="password" class="block text-sm font-medium text-gray-400">Password</label>
<UInput
id="password"
v-model="adminUser.password"
type="password"
placeholder="********"
class="w-full"
/>
</div>
<div>
<label for="confirmPassword" class="block text-sm font-medium text-gray-400">Confirm Password</label>
<UInput
id="confirmPassword"
v-model="adminUser.confirmPassword"
type="password"
placeholder="********"
class="w-full"
/>
</div>
</div>
</div>
</div>
<!-- Step 2: Main notification service -->
<div v-if="step === 1" class="space-y-6">
<div>
<label class="block text-sm font-medium text-gray-400 mb-2">Main notification service</label>
<USelect
v-model="selectedService"
:items="notificationServices"
placeholder="Select a notification service"
/>
</div>
<!-- NTFY Configuration -->
<div v-if="selectedService === 'ntfy'" class="space-y-4">
<div>
<label for="ntfy_url" class="block text-sm font-medium text-gray-400">NTFY URL</label>
<UInput
id="ntfy_url"
v-model="settings.ntfy_url"
placeholder="https://ntfy.sh/your-topic"
class="w-full"
/>
</div>
<div>
<label for="ntfy_username" class="block text-sm font-medium text-gray-400">NTFY Username</label>
<UInput
id="ntfy_username"
v-model="settings.ntfy_username"
placeholder="username"
class="w-full"
/>
</div>
<div>
<label for="ntfy_password" class="block text-sm font-medium text-gray-400">NTFY Password</label>
<UInput
id="ntfy_password"
v-model="settings.ntfy_password"
type="password"
placeholder="********"
class="w-full"
/>
<p class="mt-1 text-xs text-gray-500">
Username and password will be used to generate the auth.txt file
</p>
</div>
</div>
<!-- Discord Configuration -->
<div v-if="selectedService === 'discord'" class="space-y-4">
<div>
<label for="discord_webhook" class="block text-sm font-medium text-gray-400">Discord Webhook URL</label>
<UInput
id="discord_webhook"
v-model="settings.discord_webhook_url"
placeholder="https://discord.com/api/webhooks/..."
class="w-full"
/>
</div>
</div>
<!-- Slack Configuration -->
<div v-if="selectedService === 'slack'" class="space-y-4">
<div>
<label for="slack_webhook" class="block text-sm font-medium text-gray-400">Slack Webhook URL</label>
<UInput
id="slack_webhook"
v-model="settings.slack_webhook_url"
placeholder="https://hooks.slack.com/services/..."
class="w-full"
/>
</div>
</div>
<!-- Gotify Configuration -->
<div v-if="selectedService === 'gotify'" class="space-y-4">
<div>
<label for="gotify_url" class="block text-sm font-medium text-gray-400">Gotify URL</label>
<UInput
id="gotify_url"
v-model="settings.gotify_url"
placeholder="https://gotify.example.com"
class="w-full"
/>
</div>
<div>
<label for="gotify_token" class="block text-sm font-medium text-gray-400">Gotify Token</label>
<UInput
id="gotify_token"
v-model="settings.gotify_token"
placeholder="Axxxxxxxxx.xxxxx"
class="w-full"
/>
</div>
</div>
</div>
<!-- Step 3: GitHub Settings -->
<div v-if="step === 2" class="space-y-6">
<div>
<label for="github_token" class="block text-sm font-medium text-gray-400">GitHub Token (optional)</label>
<UInput
id="github_token"
v-model="settings.github_token"
placeholder="ghp_xxxxxxxxxxxxxxxx"
class="w-full"
/>
<p class="mt-1 text-xs text-gray-500">
A GitHub token helps avoid API rate limits and is required for private repositories
</p>
</div>
</div>
<!-- Step 4: Docker Hub Settings -->
<div v-if="step === 3" class="space-y-6">
<div>
<label for="docker_username" class="block text-sm font-medium text-gray-400">Docker Hub Username (optional)</label>
<UInput
id="docker_username"
v-model="settings.docker_username"
placeholder="username"
class="w-full"
/>
</div>
<div>
<label for="docker_password" class="block text-sm font-medium text-gray-400">Docker Hub Password (optionnel)</label>
<UInput
id="docker_password"
v-model="settings.docker_password"
type="password"
placeholder="********"
class="w-full"
/>
<p class="mt-1 text-xs text-gray-500">
Docker Hub credentials allow access to private images
</p>
</div>
</div>
<!-- Step 5: Advanced Settings -->
<div v-if="step === 4" class="space-y-6">
<div>
<label for="check_interval" class="block text-sm font-medium text-gray-400">Check Interval (seconds)</label>
<UInput
id="check_interval"
v-model="settings.check_interval"
type="number"
min="60"
placeholder="3600"
class="w-full"
/>
<p class="mt-1 text-xs text-gray-500">
Default interval is 3600 seconds (1 hour)
</p>
</div>
</div>
<div v-if="error" class="mt-6 p-3 text-sm text-red-500 bg-red-100 rounded-md">
{{ error }}
</div>
<div class="flex justify-between mt-8">
<UButton
v-if="step > 0"
@click="step--"
color="gray"
>
Previous
</UButton>
<div v-else></div>
<UButton
v-if="step < steps.length - 1"
@click="nextStep"
color="primary"
>
Next
</UButton>
<UButton
v-else
@click="saveSettings"
color="primary"
:loading="loading"
>
Complete Setup
</UButton>
</div>
</div>
</div>
</div>
</template>
<script setup>
const auth = useAuth();
const router = useRouter();
const route = useRoute();
// Check if admin exists and redirect accordingly
onMounted(async () => {
try {
// Check if admin exists
const response = await fetch('/is_configured');
if (response.ok) {
const data = await response.json();
const adminExists = data.data && data.data.admin_exists;
// If admin exists, redirect to login or dashboard
// This ensures onboarding can only be done once
if (adminExists) {
if (auth.isAuthenticated.value) {
router.push('/');
} else {
router.push('/login');
}
return;
}
}
// Only load existing settings if we're continuing with onboarding
// (only happens when no admin exists yet)
await loadExistingSettings();
} catch (err) {
console.error('Error checking configuration:', err);
}
});
// Admin user creation data
const adminUser = reactive({
username: '',
password: '',
confirmPassword: '',
});
// Onboarding steps
const steps = [
{ title: 'Create Admin', description: 'Create your administrator account' },
{ title: 'Notification Service', description: 'Choose your main notification service' },
{ title: 'GitHub Settings', description: 'Configure options for GitHub' },
{ title: 'Docker Hub Settings', description: 'Configure options for Docker Hub' },
{ title: 'Advanced Settings', description: 'Configure additional options' }
];
const step = ref(0);
const selectedService = ref(null);
const error = ref('');
const loading = ref(false);
// List of available notification services
const notificationServices = [
{ label: 'NTFY', value: 'ntfy' },
{ label: 'Discord', value: 'discord' },
{ label: 'Slack', value: 'slack' },
{ label: 'Gotify', value: 'gotify' }
];
// Application settings
const settings = reactive({
ntfy_url: '',
ntfy_username: '',
ntfy_password: '',
github_token: '',
docker_username: '',
docker_password: '',
gotify_url: '',
gotify_token: '',
discord_webhook_url: '',
slack_webhook_url: '',
check_interval: 3600
});
// Function to proceed to next step
async function nextStep() {
// Validate current step
if (step.value === 0) {
// Validate admin user creation
if (!adminUser.username) {
error.value = 'Please enter a username';
return;
}
if (!adminUser.password) {
error.value = 'Please enter a password';
return;
}
if (adminUser.password !== adminUser.confirmPassword) {
error.value = 'Passwords do not match';
return;
}
// Create admin user
try {
error.value = '';
loading.value = true;
// Register admin user
await auth.register(adminUser.username, adminUser.password, true);
// Continue to next step
loading.value = false;
step.value++;
return;
} catch (err) {
error.value = err.message || 'Error creating admin user';
loading.value = false;
return;
}
}
else if (step.value === 1) {
if (!selectedService.value) {
error.value = 'Please select a notification service';
return;
}
// Validate selected service
if (selectedService.value === 'ntfy' && !settings.ntfy_url) {
error.value = 'Please enter the NTFY URL';
return;
} else if (selectedService.value === 'ntfy' && (!settings.ntfy_username || !settings.ntfy_password)) {
error.value = 'Please enter both NTFY username and password';
return;
} else if (selectedService.value === 'discord' && !settings.discord_webhook_url) {
error.value = 'Please enter the Discord webhook URL';
return;
} else if (selectedService.value === 'slack' && !settings.slack_webhook_url) {
error.value = 'Please enter the Slack webhook URL';
return;
} else if (selectedService.value === 'gotify' && (!settings.gotify_url || !settings.gotify_token)) {
error.value = 'Please enter both Gotify URL and token';
return;
}
}
// Reset error and proceed to next step
error.value = '';
step.value++;
}
// Function to save settings
async function saveSettings() {
try {
loading.value = true;
// Prepare settings
const now = new Date().toISOString();
const settingsData = {
...settings,
last_updated: now
};
// Format NTFY auth if credentials are provided
if (selectedService.value === 'ntfy' && settings.ntfy_username && settings.ntfy_password) {
// Create auth string in the format expected by the backend
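// Assumed format "username:password" (e.g. "alice:s3cret"); the settings page hint suggests the backend writes this to auth.txt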
const authString = `${settings.ntfy_username}:${settings.ntfy_password}`;
settingsData.auth = authString;
}
// Send settings to server
const response = await fetch('/settings', {
method: 'PUT',
headers: {
'Content-Type': 'application/json',
'Authorization': auth.token.value
},
body: JSON.stringify(settingsData)
});
if (!response.ok) {
const data = await response.json();
throw new Error(data.message || 'Error saving settings');
}
// Redirect to main page
router.push('/');
} catch (err) {
error.value = err.message || 'An error occurred while saving settings';
} finally {
loading.value = false;
}
}
// Function to load existing settings
async function loadExistingSettings() {
try {
if (!auth.isAuthenticated.value) return;
const response = await fetch('/settings', {
headers: {
'Authorization': auth.token.value
}
});
if (response.ok) {
const data = await response.json();
if (data.success && data.data) {
// Populate settings with existing values
const existingSettings = data.data;
// Update notification service selection
if (existingSettings.ntfy_url) {
selectedService.value = 'ntfy';
settings.ntfy_url = existingSettings.ntfy_url;
// Parse auth string if it exists (format: username:password)
if (existingSettings.auth) {
const authParts = existingSettings.auth.split(':');
if (authParts.length === 2) {
settings.ntfy_username = authParts[0];
settings.ntfy_password = authParts[1];
}
}
} else if (existingSettings.discord_webhook_url) {
selectedService.value = 'discord';
settings.discord_webhook_url = existingSettings.discord_webhook_url;
} else if (existingSettings.slack_webhook_url) {
selectedService.value = 'slack';
settings.slack_webhook_url = existingSettings.slack_webhook_url;
} else if (existingSettings.gotify_url) {
selectedService.value = 'gotify';
settings.gotify_url = existingSettings.gotify_url;
settings.gotify_token = existingSettings.gotify_token;
}
// Update other settings
settings.github_token = existingSettings.github_token || '';
settings.docker_username = existingSettings.docker_username || '';
settings.docker_password = existingSettings.docker_password || '';
settings.check_interval = existingSettings.check_interval || 3600;
}
}
} catch (err) {
console.error('Error loading existing settings:', err);
}
}
</script>

294
web/pages/settings.vue Normal file

@@ -0,0 +1,294 @@
<template>
<div>
<AppHeader />
<div class="container mx-auto px-4 py-8">
<h1 class="text-2xl font-bold text-white mb-8">Settings</h1>
<UCard class="mb-8">
<template #header>
<div class="flex justify-between items-center">
<h2 class="text-xl font-semibold">Notification Services</h2>
</div>
</template>
<div class="space-y-6">
<!-- NTFY -->
<div>
<h3 class="text-lg font-medium mb-2">NTFY</h3>
<div class="space-y-2">
<UInput
v-model="settings.ntfy_url"
label="NTFY URL"
placeholder="https://ntfy.sh/your-topic"
class="w-full"
/>
<UInput
v-model="settings.ntfy_username"
label="NTFY Username"
placeholder="username"
class="w-full"
/>
<UInput
v-model="settings.ntfy_password"
label="NTFY Password"
type="password"
placeholder="********"
class="w-full"
/>
<p class="mt-1 text-xs text-gray-500">
Username and password will be used to generate the auth.txt file
</p>
</div>
</div>
<!-- Discord -->
<div>
<h3 class="text-lg font-medium mb-2">Discord</h3>
<UInput
v-model="settings.discord_webhook_url"
label="Discord Webhook URL"
placeholder="https://discord.com/api/webhooks/..."
class="w-full"
/>
</div>
<!-- Slack -->
<div>
<h3 class="text-lg font-medium mb-2">Slack</h3>
<UInput
v-model="settings.slack_webhook_url"
label="Slack Webhook URL"
placeholder="https://hooks.slack.com/services/..."
class="w-full"
/>
</div>
<!-- Gotify -->
<div>
<h3 class="text-lg font-medium mb-2">Gotify</h3>
<div class="space-y-2">
<UInput
v-model="settings.gotify_url"
label="Gotify URL"
placeholder="https://gotify.example.com"
class="w-full"
/>
<UInput
v-model="settings.gotify_token"
label="Gotify Token"
placeholder="Axxxxxxxxx.xxxxx"
class="w-full"
/>
</div>
</div>
</div>
</UCard>
<UCard class="mb-8">
<template #header>
<div class="flex justify-between items-center">
<h2 class="text-xl font-semibold">GitHub</h2>
</div>
</template>
<div>
<UInput
v-model="settings.github_token"
label="GitHub Token (optional)"
placeholder="ghp_xxxxxxxxxxxxxxxx"
class="w-full"
/>
<p class="mt-1 text-xs text-gray-500">
A GitHub token helps avoid API rate limits and enables access to private repositories
</p>
</div>
</UCard>
<UCard class="mb-8">
<template #header>
<div class="flex justify-between items-center">
<h2 class="text-xl font-semibold">Docker Hub</h2>
</div>
</template>
<div class="space-y-4">
<UInput
v-model="settings.docker_username"
label="Docker Hub Username (optional)"
placeholder="username"
class="w-full"
/>
<UInput
v-model="settings.docker_password"
label="Docker Hub Password (optional)"
type="password"
placeholder="********"
class="w-full"
/>
<p class="mt-1 text-xs text-gray-500">
Docker Hub credentials allow access to private images
</p>
</div>
</UCard>
<UCard class="mb-8">
<template #header>
<div class="flex justify-between items-center">
<h2 class="text-xl font-semibold">Advanced Settings</h2>
</div>
</template>
<div>
<UInput
v-model="settings.check_interval"
label="Check Interval (seconds)"
type="number"
min="60"
placeholder="3600"
class="w-full"
/>
<p class="mt-1 text-xs text-gray-500">
Default interval is 3600 seconds (1 hour)
</p>
</div>
</UCard>
<div v-if="error" class="p-3 mb-6 text-sm text-red-500 bg-red-100 rounded-md">
{{ error }}
</div>
<div v-if="success" class="p-3 mb-6 text-sm text-green-500 bg-green-100 rounded-md">
{{ success }}
</div>
<div class="flex justify-end">
<UButton
@click="saveSettings"
color="primary"
:loading="loading"
>
Save Changes
</UButton>
</div>
</div>
<AppFooter />
</div>
</template>
<script setup>
const auth = useAuth();
const router = useRouter();
// Check if user is authenticated
onMounted(async () => {
if (!auth.isAuthenticated.value) {
return router.push('/login');
}
// Load current settings
await loadSettings();
});
const settings = reactive({
ntfy_url: '',
ntfy_username: '',
ntfy_password: '',
github_token: '',
docker_username: '',
docker_password: '',
gotify_url: '',
gotify_token: '',
discord_webhook_url: '',
slack_webhook_url: '',
check_interval: 3600
});
const error = ref('');
const success = ref('');
const loading = ref(false);
// Load current settings
async function loadSettings() {
try {
loading.value = true;
const response = await fetch('/settings', {
method: 'GET',
headers: {
'Authorization': auth.token.value
}
});
if (!response.ok) {
const data = await response.json();
throw new Error(data.message || 'Error loading settings');
}
const data = await response.json();
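// Assumed GET /settings response shape, inferred from the usage below: { success: boolean, data: { ...settings fields, auth?: "username:password" } }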
if (data.success && data.data) {
// Update settings with loaded values
Object.assign(settings, data.data);
// Parse NTFY auth string if it exists
if (data.data.auth) {
const authParts = data.data.auth.split(':');
if (authParts.length === 2) {
settings.ntfy_username = authParts[0];
settings.ntfy_password = authParts[1];
}
}
}
} catch (err) {
error.value = err.message || 'An error occurred while loading settings';
} finally {
loading.value = false;
}
}
// Function to save settings
async function saveSettings() {
try {
loading.value = true;
error.value = '';
success.value = '';
// Prepare settings
const now = new Date().toISOString();
const settingsData = {
...settings,
last_updated: now
};
// Format NTFY auth if credentials are provided
if (settings.ntfy_url && settings.ntfy_username && settings.ntfy_password) {
// Create auth string in the format expected by the backend
const authString = `${settings.ntfy_username}:${settings.ntfy_password}`;
settingsData.auth = authString;
}
// Send settings to server
const response = await fetch('/settings', {
method: 'PUT',
headers: {
'Content-Type': 'application/json',
'Authorization': auth.token.value
},
body: JSON.stringify(settingsData)
});
if (!response.ok) {
const data = await response.json();
throw new Error(data.message || 'Error saving settings');
}
success.value = 'Settings updated successfully';
} catch (err) {
error.value = err.message || 'An error occurred while saving settings';
} finally {
loading.value = false;
}
}
</script>

24
web/plugins/auth.js Normal file

@@ -0,0 +1,24 @@
// Authentication verification plugin
export default defineNuxtPlugin(() => {
console.log('Authentication plugin loaded');
addRouteMiddleware('auth', (to) => {
console.log('Auth middleware executed for route:', to.path);
if (to.path === '/login' || to.path === '/onboarding') {
return;
}
if (process.client) {
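// localStorage is only available in the browser, so this check is skipped during server-side rendering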
const token = localStorage.getItem('token');
const user = localStorage.getItem('user');
console.log('Authentication check:', !!token, !!user);
if (!token || !user) {
console.log('Redirecting to /login');
return navigateTo('/login');
}
}
}, { global: true });
});

8486
web/pnpm-lock.yaml generated Normal file

File diff suppressed because it is too large

BIN
web/public/favicon.ico Normal file

Binary file not shown (4.2 KiB).

2
web/public/robots.txt Normal file

@@ -0,0 +1,2 @@
User-Agent: *
Disallow:

3
web/server/tsconfig.json Normal file

@@ -0,0 +1,3 @@
{
"extends": "../.nuxt/tsconfig.server.json"
}

20
web/tailwind.config.js Normal file

@@ -0,0 +1,20 @@
/** @type {import('tailwindcss').Config} */
export default {
content: [
"./components/**/*.{js,vue,ts}",
"./layouts/**/*.vue",
"./pages/**/*.vue",
"./plugins/**/*.{js,ts}",
"./app.vue",
"./node_modules/@nuxt/ui/dist/**/*.{mjs,js,vue}"
],
theme: {
extend: {
colors: {
'emerald-950': '#23453d'
}
},
},
plugins: [],
}

4
web/tsconfig.json Normal file

@@ -0,0 +1,4 @@
{
// https://nuxt.com/docs/guide/concepts/typescript
"extends": "./.nuxt/tsconfig.json"
}