mirror of https://github.com/immich-app/immich.git
feat(server,web): libraries (#3124)
* feat: libraries

Co-authored-by: Jason Rasmussen <jrasm91@gmail.com>
Co-authored-by: Alex <alex.tran1502@gmail.com>

pull/4160/head
parent 816db700e1
commit acdc66413c

@@ -0,0 +1,148 @@

# Libraries

## Overview

Immich supports the creation of libraries, which are top-level asset containers. Currently, there are two types of libraries: traditional upload libraries that can sync with a mobile device, and external libraries, which stay up to date with files on disk. Libraries differ from albums in that an asset can belong to multiple albums but only one library, and deleting a library deletes all assets contained within it. As of August 2023, this is a new feature, and libraries have a lot of potential for future development beyond what is documented here. This document describes the current state of libraries.

## The Upload Library

Immich comes preconfigured with an upload library for each user. All assets uploaded to Immich are added to this library. The upload library can be renamed but not deleted, and it is the only library that can be synced with a mobile device. To prevent duplicates, no item in an upload library may have the same SHA-1 hash as another item in the same library.

## External Libraries

External libraries track assets stored outside of Immich, i.e. in the file system. Immich only reads data from these files and never modifies them; accordingly, the delete button is disabled for external assets. When an external library is scanned, Immich reads the metadata from each image or video file and creates an asset in the library for it. These items are then shown in the main timeline, and they look and behave like any other asset, including viewing on the map, adding to albums, etc.

If a file is modified outside of Immich, the changes are not reflected in Immich until the library is scanned again. There are different ways to scan a library depending on the use case (a sketch of triggering these scans through the API follows the list):

- Scan Library Files: This is the default scan method and also the quickest. It scans all files in the library and adds new files. It notices if any files are missing (see below) but does not check existing assets.
- Scan All Library Files: Same as above, but also checks each existing asset to see whether its modification time has changed. If it has, the asset is updated. Since every asset has to be checked, this is slower than Scan Library Files.
- Force Scan All Library Files: Same as above, but re-reads each asset from disk regardless of modification time. This is useful when an asset has been modified externally without its modification time changing. Because it reads every asset from disk, this is the slowest scan method.
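
These scans can also be triggered through the API. Below is a minimal sketch using the generated Dart client from this PR; the mapping of the three web buttons to the `ScanLibraryDto` flags is an assumption based on the flag names, and authentication setup (see the generated `LibraryApi` examples further down) is omitted.

```dart
import 'package:openapi/api.dart';

// Minimal sketch; assumes the default API client is already authenticated.
Future<void> triggerScans(String libraryId) async {
  final api = LibraryApi();

  // "Scan Library Files": default scan, no flags set.
  await api.scanLibrary(libraryId, ScanLibraryDto());

  // "Scan All Library Files": assumed to map to refreshModifiedFiles.
  await api.scanLibrary(libraryId, ScanLibraryDto(refreshModifiedFiles: true));

  // "Force Scan All Library Files": assumed to map to refreshAllFiles.
  await api.scanLibrary(libraryId, ScanLibraryDto(refreshAllFiles: true));
}
```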

:::caution

Due to aggressive caching it can take some time for a refreshed asset to appear correctly in the web view. You need to clear your browser cache to see the changes. This is a known issue that will be fixed in a future release. In Chrome, open the developer console with F12, reload the page with F5, then right-click the reload button and select "Empty Cache and Hard Reload".

:::

In external libraries, the file path is used for duplicate detection. This means that if a file is moved to a different location, it is added as a new asset, and if it is then moved back to its original location, it is once again treated as a new asset. In contrast to upload libraries, two identical files can exist in an external library as long as they are in different locations. This is a deliberate design choice to make Immich reflect the file system as closely as possible. Remember that duplicate detection is only done within the same library, so if you have multiple external libraries, the same file can be added to several of them.

:::caution

If you add assets from an external library to an album and then move the asset to another location within the library, the asset will be removed from the album upon rescan. This is because the asset is considered a new asset after the move. This is a known issue that will be fixed in a future release.

:::

### Deleted External Assets

In all of the above scan methods, Immich checks whether any files are missing. This can happen if files are deleted, or if they are on a storage location that is currently unavailable, such as a network drive that is not mounted or a USB drive that has been unplugged. To prevent accidental deletion of assets, Immich does not immediately delete an asset from the library when its file is missing. Instead, the asset is internally marked as offline and remains visible in the main timeline. If the file is moved back to its original location and the library is scanned again, the asset is restored.

Finally, offline files can be deleted from Immich via the `Remove Offline Files` job. Any assets marked as offline will then be removed from Immich. Run this job whenever files have been deleted from the file system and you want them removed from Immich. Note that a library scan must be performed first so that the assets are marked as offline.
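
The same cleanup can be scripted with the generated Dart client. This is a rough sketch that assumes an already-authenticated client; note that the scan runs as a background job on the server, so in practice you would wait for it to finish before removing offline files.

```dart
import 'package:openapi/api.dart';

Future<void> purgeDeletedFiles(String libraryId) async {
  final api = LibraryApi();

  // Step 1: scan the library so that missing files are marked as offline.
  await api.scanLibrary(libraryId, ScanLibraryDto());

  // (Wait for the background scan job to finish before continuing.)

  // Step 2: remove all assets that are now marked as offline.
  await api.removeOfflineFiles(libraryId);
}
```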

### Import Paths

External libraries use import paths to determine which files to scan. Each library can have multiple import paths so that files from different locations can be added to the same library. Import paths are scanned recursively, and if a file is in multiple import paths, it is only added once. If the import paths are edited so that a file no longer falls under any import path, it is removed from the library in the same way a deleted file would be. If the file later falls under an import path again, it is added again as if it were a new file.

### Security Considerations

For security reasons, Immich users are not allowed to add external files by default. This prevents devastating [path traversal attacks](https://owasp.org/www-community/attacks/Path_Traversal). An admin can allow individual users to use the external path feature via the `external path` setting found in the admin panel. Without the external path restriction, a user could import any image or video file on the Immich host filesystem into Immich, potentially exposing sensitive data. If you are running Immich as root in your Docker setup (which is the default), all external file reads are done with root privileges. This is particularly dangerous if the Immich host is a shared server.

With the `external path` set, a user can only access external files and directories within that path. The Immich admin should still be careful not to set the external path too generously. For example, say `user1` wants to read their photos stored in `/home/user1`. A lazy admin sets that user's external path to `/home/` since it "gets the job done". However, that user will then also be able to read all photos in `/home/user2/private-photos`! Set the external path as specifically as possible. If multiple folders must be added, do so using the Docker volume mount feature described below.

### Exclusion Patterns and Scan Settings

By default, all files in the import paths are added to the library. If there are files that should not be added, exclusion patterns can be used to exclude them. Exclusion patterns are glob patterns that are matched against the full file path. If a file matches an exclusion pattern, it is not added to the library. Exclusion patterns can be added in the Scan Settings page for each library. Under the hood, Immich uses the [glob](https://www.npmjs.com/package/glob) package to match patterns, so please refer to [their documentation](https://github.com/isaacs/node-glob#glob-primer) to see what patterns are supported.

Some basic examples (a small matching sketch follows the list):

- `*.tif` will exclude all files with the extension `.tif`
- `hidden.jpg` will exclude all files named `hidden.jpg`
- `**/Raw/**` will exclude all files in any directory named `Raw`
- `*.{tif,jpg}` will exclude all files with the extension `.tif` or `.jpg`
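
Immich evaluates these patterns server-side with the node `glob` package linked above, but their behavior can be illustrated with the Dart `glob` package as well. A small sketch with made-up, relative paths:

```dart
import 'package:glob/glob.dart';

void main() {
  // Hypothetical file paths, used only for illustration.
  final paths = [
    'christmas-trip/IMG_0001.jpg',
    'christmas-trip/Raw/IMG_0001.cr2',
  ];
  final exclude = Glob('**/Raw/**');

  for (final path in paths) {
    // A path matching the exclusion pattern would be skipped during import.
    final status = exclude.matches(path) ? 'excluded' : 'imported';
    print('$path -> $status');
  }
}
```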

## Usage

Let's go through a concrete example where we add an existing gallery to Immich. Here, we have the following folders we want to add:

- `/home/user/old-pics`: a folder containing childhood photos.
- `/mnt/nas/christmas-trip`: photos from a Christmas trip. The subfolder `/mnt/nas/christmas-trip/Raw` contains the raw files straight from the DSLR. We don't want to import the raw files into Immich.
- `/mnt/media/videos`: videos from the same Christmas trip.

First, we need to plan how to organize the libraries. The Christmas trip photos should go in their own library, since we want to exclude the raw files. The videos and old photos can share a library, since we want to import all of their files. We could also add all three folders to the same library if no files in the other folders match the Raw exclusion pattern.

### Mount Docker Volumes

The `immich-server` and `immich-microservices` containers need access to the gallery. Modify your docker compose file as follows:

```diff title="docker-compose.yml"
  immich-server:
    volumes:
      - ${UPLOAD_LOCATION}:/usr/src/app/upload
+     - /mnt/nas/christmas-trip:/mnt/media/christmas-trip:ro
+     - /home/user/old-pics:/mnt/media/old-pics:ro
+     - /mnt/media/videos:/mnt/media/videos:ro

  immich-microservices:
    volumes:
      - ${UPLOAD_LOCATION}:/usr/src/app/upload
+     - /mnt/nas/christmas-trip:/mnt/media/christmas-trip:ro
+     - /home/user/old-pics:/mnt/media/old-pics:ro
+     - /mnt/media/videos:/mnt/media/videos:ro
```

:::tip
The `ro` flag at the end gives the container read-only access to the volume. While Immich does not modify files, mounting read-only is good practice.
:::

_Remember to bring the containers down and up again to register the changes. Make sure you can see the mounted path inside the container._

### Set External Path

Only an admin can do this.

- Navigate to the `Administration > Users` page on the web.
- Click on the user's edit button.
- Set `/mnt/media` as the external path. This folder contains only the three folders we want to import, so nothing else can be accessed.

### Create External Libraries

- Click on your user name in the top right corner -> Account Settings
- Click on Libraries
- Click on Create External Library
- Click the drop-down menu on the newly created library
- Click on Rename Library and rename it to "Christmas Trip"
- Click Edit Import Paths
- Click on Add Path
- Enter `/mnt/media/christmas-trip` then click Add

NOTE: We have to use the `/mnt/media/christmas-trip` path and not the `/mnt/nas/christmas-trip` path, since all paths must be the paths that the Docker containers see.

Next, we'll add an exclusion pattern to filter out the raw files.

- Click the drop-down menu on the newly created Christmas library
- Click on Manage
- Click on Scan Settings
- Click on Add Exclusion Pattern
- Enter `**/Raw/**` and click Save
- Click Save
- Click the drop-down menu on the newly created library
- Click on Scan Library Files

The Christmas trip library will now be scanned in the background. In the meantime, let's add the videos and old photos to another library.

- Click on Create External Library

:::info Note
If you get an error here, please rename the other external library to something else. This is a bug that will be fixed in a future release.
:::

- Click the drop-down menu on the newly created library
- Click Edit Import Paths
- Click on Add Path
- Enter `/mnt/media/old-pics` then click Add
- Click on Add Path
- Enter `/mnt/media/videos` then click Add
- Click Save
- Click on Scan Library Files

Within seconds, the assets from the old-pics and videos folders should show up in the main timeline.
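
If you prefer to script this setup instead of clicking through the web UI, the generated Dart client in this PR exposes the same operations. This is a hedged sketch: it assumes an already-authenticated client, and `LibraryType.external` is a hypothetical enum value name (the LibraryType values are not listed here, so check the generated `LibraryType` class).

```dart
import 'package:openapi/api.dart';

Future<void> createChristmasTripLibrary() async {
  final api = LibraryApi();

  // Create the external library with its import path and exclusion pattern.
  // `LibraryType.external` is a placeholder for whatever the generated enum
  // actually calls the external library type.
  final library = await api.createLibrary(CreateLibraryDto(
    type: LibraryType.external,
    name: 'Christmas Trip',
    importPaths: ['/mnt/media/christmas-trip'],
    exclusionPatterns: ['**/Raw/**'],
  ));

  if (library != null) {
    // Kick off the initial scan, which runs as a background job.
    await api.scanLibrary(library.id, ScanLibraryDto());
  }
}
```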

@@ -0,0 +1,19 @@

# openapi.model.CreateLibraryDto

## Load the model package
```dart
import 'package:openapi/api.dart';
```

## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**exclusionPatterns** | **List<String>** | | [optional] [default to const []]
**importPaths** | **List<String>** | | [optional] [default to const []]
**isVisible** | **bool** | | [optional]
**name** | **String** | | [optional]
**type** | [**LibraryType**](LibraryType.md) | |

[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)

@ -0,0 +1,458 @@
|
||||
# openapi.api.LibraryApi
|
||||
|
||||
## Load the API package
|
||||
```dart
|
||||
import 'package:openapi/api.dart';
|
||||
```
|
||||
|
||||
All URIs are relative to */api*
|
||||
|
||||
Method | HTTP request | Description
|
||||
------------- | ------------- | -------------
|
||||
[**createLibrary**](LibraryApi.md#createlibrary) | **POST** /library |
|
||||
[**deleteLibrary**](LibraryApi.md#deletelibrary) | **DELETE** /library/{id} |
|
||||
[**getAllForUser**](LibraryApi.md#getallforuser) | **GET** /library |
|
||||
[**getLibraryInfo**](LibraryApi.md#getlibraryinfo) | **GET** /library/{id} |
|
||||
[**getLibraryStatistics**](LibraryApi.md#getlibrarystatistics) | **GET** /library/{id}/statistics |
|
||||
[**removeOfflineFiles**](LibraryApi.md#removeofflinefiles) | **POST** /library/{id}/removeOffline |
|
||||
[**scanLibrary**](LibraryApi.md#scanlibrary) | **POST** /library/{id}/scan |
|
||||
[**updateLibrary**](LibraryApi.md#updatelibrary) | **PUT** /library/{id} |
|
||||
|
||||
|
||||
# **createLibrary**
|
||||
> LibraryResponseDto createLibrary(createLibraryDto)
|
||||
|
||||
|
||||
|
||||
### Example
|
||||
```dart
|
||||
import 'package:openapi/api.dart';
|
||||
// TODO Configure API key authorization: cookie
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('cookie').apiKey = 'YOUR_API_KEY';
|
||||
// uncomment below to setup prefix (e.g. Bearer) for API key, if needed
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('cookie').apiKeyPrefix = 'Bearer';
|
||||
// TODO Configure API key authorization: api_key
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('api_key').apiKey = 'YOUR_API_KEY';
|
||||
// uncomment below to setup prefix (e.g. Bearer) for API key, if needed
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('api_key').apiKeyPrefix = 'Bearer';
|
||||
// TODO Configure HTTP Bearer authorization: bearer
|
||||
// Case 1. Use String Token
|
||||
//defaultApiClient.getAuthentication<HttpBearerAuth>('bearer').setAccessToken('YOUR_ACCESS_TOKEN');
|
||||
// Case 2. Use Function which generate token.
|
||||
// String yourTokenGeneratorFunction() { ... }
|
||||
//defaultApiClient.getAuthentication<HttpBearerAuth>('bearer').setAccessToken(yourTokenGeneratorFunction);
|
||||
|
||||
final api_instance = LibraryApi();
|
||||
final createLibraryDto = CreateLibraryDto(); // CreateLibraryDto |
|
||||
|
||||
try {
|
||||
final result = api_instance.createLibrary(createLibraryDto);
|
||||
print(result);
|
||||
} catch (e) {
|
||||
print('Exception when calling LibraryApi->createLibrary: $e\n');
|
||||
}
|
||||
```
|
||||
|
||||
### Parameters
|
||||
|
||||
Name | Type | Description | Notes
|
||||
------------- | ------------- | ------------- | -------------
|
||||
**createLibraryDto** | [**CreateLibraryDto**](CreateLibraryDto.md)| |
|
||||
|
||||
### Return type
|
||||
|
||||
[**LibraryResponseDto**](LibraryResponseDto.md)
|
||||
|
||||
### Authorization
|
||||
|
||||
[cookie](../README.md#cookie), [api_key](../README.md#api_key), [bearer](../README.md#bearer)
|
||||
|
||||
### HTTP request headers
|
||||
|
||||
- **Content-Type**: application/json
|
||||
- **Accept**: application/json
|
||||
|
||||
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
|
||||
|
||||
# **deleteLibrary**
|
||||
> deleteLibrary(id)
|
||||
|
||||
|
||||
|
||||
### Example
|
||||
```dart
|
||||
import 'package:openapi/api.dart';
|
||||
// TODO Configure API key authorization: cookie
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('cookie').apiKey = 'YOUR_API_KEY';
|
||||
// uncomment below to setup prefix (e.g. Bearer) for API key, if needed
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('cookie').apiKeyPrefix = 'Bearer';
|
||||
// TODO Configure API key authorization: api_key
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('api_key').apiKey = 'YOUR_API_KEY';
|
||||
// uncomment below to setup prefix (e.g. Bearer) for API key, if needed
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('api_key').apiKeyPrefix = 'Bearer';
|
||||
// TODO Configure HTTP Bearer authorization: bearer
|
||||
// Case 1. Use String Token
|
||||
//defaultApiClient.getAuthentication<HttpBearerAuth>('bearer').setAccessToken('YOUR_ACCESS_TOKEN');
|
||||
// Case 2. Use Function which generate token.
|
||||
// String yourTokenGeneratorFunction() { ... }
|
||||
//defaultApiClient.getAuthentication<HttpBearerAuth>('bearer').setAccessToken(yourTokenGeneratorFunction);
|
||||
|
||||
final api_instance = LibraryApi();
|
||||
final id = 38400000-8cf0-11bd-b23e-10b96e4ef00d; // String |
|
||||
|
||||
try {
|
||||
api_instance.deleteLibrary(id);
|
||||
} catch (e) {
|
||||
print('Exception when calling LibraryApi->deleteLibrary: $e\n');
|
||||
}
|
||||
```
|
||||
|
||||
### Parameters
|
||||
|
||||
Name | Type | Description | Notes
|
||||
------------- | ------------- | ------------- | -------------
|
||||
**id** | **String**| |
|
||||
|
||||
### Return type
|
||||
|
||||
void (empty response body)
|
||||
|
||||
### Authorization
|
||||
|
||||
[cookie](../README.md#cookie), [api_key](../README.md#api_key), [bearer](../README.md#bearer)
|
||||
|
||||
### HTTP request headers
|
||||
|
||||
- **Content-Type**: Not defined
|
||||
- **Accept**: Not defined
|
||||
|
||||
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
|
||||
|
||||
# **getAllForUser**
|
||||
> List<LibraryResponseDto> getAllForUser()
|
||||
|
||||
|
||||
|
||||
### Example
|
||||
```dart
|
||||
import 'package:openapi/api.dart';
|
||||
// TODO Configure API key authorization: cookie
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('cookie').apiKey = 'YOUR_API_KEY';
|
||||
// uncomment below to setup prefix (e.g. Bearer) for API key, if needed
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('cookie').apiKeyPrefix = 'Bearer';
|
||||
// TODO Configure API key authorization: api_key
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('api_key').apiKey = 'YOUR_API_KEY';
|
||||
// uncomment below to setup prefix (e.g. Bearer) for API key, if needed
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('api_key').apiKeyPrefix = 'Bearer';
|
||||
// TODO Configure HTTP Bearer authorization: bearer
|
||||
// Case 1. Use String Token
|
||||
//defaultApiClient.getAuthentication<HttpBearerAuth>('bearer').setAccessToken('YOUR_ACCESS_TOKEN');
|
||||
// Case 2. Use Function which generate token.
|
||||
// String yourTokenGeneratorFunction() { ... }
|
||||
//defaultApiClient.getAuthentication<HttpBearerAuth>('bearer').setAccessToken(yourTokenGeneratorFunction);
|
||||
|
||||
final api_instance = LibraryApi();
|
||||
|
||||
try {
|
||||
final result = api_instance.getAllForUser();
|
||||
print(result);
|
||||
} catch (e) {
|
||||
print('Exception when calling LibraryApi->getAllForUser: $e\n');
|
||||
}
|
||||
```
|
||||
|
||||
### Parameters
|
||||
This endpoint does not need any parameter.
|
||||
|
||||
### Return type
|
||||
|
||||
[**List<LibraryResponseDto>**](LibraryResponseDto.md)
|
||||
|
||||
### Authorization
|
||||
|
||||
[cookie](../README.md#cookie), [api_key](../README.md#api_key), [bearer](../README.md#bearer)
|
||||
|
||||
### HTTP request headers
|
||||
|
||||
- **Content-Type**: Not defined
|
||||
- **Accept**: application/json
|
||||
|
||||
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
|
||||
|
||||
# **getLibraryInfo**
|
||||
> LibraryResponseDto getLibraryInfo(id)
|
||||
|
||||
|
||||
|
||||
### Example
|
||||
```dart
|
||||
import 'package:openapi/api.dart';
|
||||
// TODO Configure API key authorization: cookie
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('cookie').apiKey = 'YOUR_API_KEY';
|
||||
// uncomment below to setup prefix (e.g. Bearer) for API key, if needed
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('cookie').apiKeyPrefix = 'Bearer';
|
||||
// TODO Configure API key authorization: api_key
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('api_key').apiKey = 'YOUR_API_KEY';
|
||||
// uncomment below to setup prefix (e.g. Bearer) for API key, if needed
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('api_key').apiKeyPrefix = 'Bearer';
|
||||
// TODO Configure HTTP Bearer authorization: bearer
|
||||
// Case 1. Use String Token
|
||||
//defaultApiClient.getAuthentication<HttpBearerAuth>('bearer').setAccessToken('YOUR_ACCESS_TOKEN');
|
||||
// Case 2. Use Function which generate token.
|
||||
// String yourTokenGeneratorFunction() { ... }
|
||||
//defaultApiClient.getAuthentication<HttpBearerAuth>('bearer').setAccessToken(yourTokenGeneratorFunction);
|
||||
|
||||
final api_instance = LibraryApi();
|
||||
final id = 38400000-8cf0-11bd-b23e-10b96e4ef00d; // String |
|
||||
|
||||
try {
|
||||
final result = api_instance.getLibraryInfo(id);
|
||||
print(result);
|
||||
} catch (e) {
|
||||
print('Exception when calling LibraryApi->getLibraryInfo: $e\n');
|
||||
}
|
||||
```
|
||||
|
||||
### Parameters
|
||||
|
||||
Name | Type | Description | Notes
|
||||
------------- | ------------- | ------------- | -------------
|
||||
**id** | **String**| |
|
||||
|
||||
### Return type
|
||||
|
||||
[**LibraryResponseDto**](LibraryResponseDto.md)
|
||||
|
||||
### Authorization
|
||||
|
||||
[cookie](../README.md#cookie), [api_key](../README.md#api_key), [bearer](../README.md#bearer)
|
||||
|
||||
### HTTP request headers
|
||||
|
||||
- **Content-Type**: Not defined
|
||||
- **Accept**: application/json
|
||||
|
||||
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
|
||||
|
||||
# **getLibraryStatistics**
|
||||
> LibraryStatsResponseDto getLibraryStatistics(id)
|
||||
|
||||
|
||||
|
||||
### Example
|
||||
```dart
|
||||
import 'package:openapi/api.dart';
|
||||
// TODO Configure API key authorization: cookie
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('cookie').apiKey = 'YOUR_API_KEY';
|
||||
// uncomment below to setup prefix (e.g. Bearer) for API key, if needed
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('cookie').apiKeyPrefix = 'Bearer';
|
||||
// TODO Configure API key authorization: api_key
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('api_key').apiKey = 'YOUR_API_KEY';
|
||||
// uncomment below to setup prefix (e.g. Bearer) for API key, if needed
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('api_key').apiKeyPrefix = 'Bearer';
|
||||
// TODO Configure HTTP Bearer authorization: bearer
|
||||
// Case 1. Use String Token
|
||||
//defaultApiClient.getAuthentication<HttpBearerAuth>('bearer').setAccessToken('YOUR_ACCESS_TOKEN');
|
||||
// Case 2. Use Function which generate token.
|
||||
// String yourTokenGeneratorFunction() { ... }
|
||||
//defaultApiClient.getAuthentication<HttpBearerAuth>('bearer').setAccessToken(yourTokenGeneratorFunction);
|
||||
|
||||
final api_instance = LibraryApi();
|
||||
final id = 38400000-8cf0-11bd-b23e-10b96e4ef00d; // String |
|
||||
|
||||
try {
|
||||
final result = api_instance.getLibraryStatistics(id);
|
||||
print(result);
|
||||
} catch (e) {
|
||||
print('Exception when calling LibraryApi->getLibraryStatistics: $e\n');
|
||||
}
|
||||
```
|
||||
|
||||
### Parameters
|
||||
|
||||
Name | Type | Description | Notes
|
||||
------------- | ------------- | ------------- | -------------
|
||||
**id** | **String**| |
|
||||
|
||||
### Return type
|
||||
|
||||
[**LibraryStatsResponseDto**](LibraryStatsResponseDto.md)
|
||||
|
||||
### Authorization
|
||||
|
||||
[cookie](../README.md#cookie), [api_key](../README.md#api_key), [bearer](../README.md#bearer)
|
||||
|
||||
### HTTP request headers
|
||||
|
||||
- **Content-Type**: Not defined
|
||||
- **Accept**: application/json
|
||||
|
||||
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
|
||||
|
||||
# **removeOfflineFiles**
|
||||
> removeOfflineFiles(id)
|
||||
|
||||
|
||||
|
||||
### Example
|
||||
```dart
|
||||
import 'package:openapi/api.dart';
|
||||
// TODO Configure API key authorization: cookie
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('cookie').apiKey = 'YOUR_API_KEY';
|
||||
// uncomment below to setup prefix (e.g. Bearer) for API key, if needed
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('cookie').apiKeyPrefix = 'Bearer';
|
||||
// TODO Configure API key authorization: api_key
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('api_key').apiKey = 'YOUR_API_KEY';
|
||||
// uncomment below to setup prefix (e.g. Bearer) for API key, if needed
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('api_key').apiKeyPrefix = 'Bearer';
|
||||
// TODO Configure HTTP Bearer authorization: bearer
|
||||
// Case 1. Use String Token
|
||||
//defaultApiClient.getAuthentication<HttpBearerAuth>('bearer').setAccessToken('YOUR_ACCESS_TOKEN');
|
||||
// Case 2. Use Function which generate token.
|
||||
// String yourTokenGeneratorFunction() { ... }
|
||||
//defaultApiClient.getAuthentication<HttpBearerAuth>('bearer').setAccessToken(yourTokenGeneratorFunction);
|
||||
|
||||
final api_instance = LibraryApi();
|
||||
final id = 38400000-8cf0-11bd-b23e-10b96e4ef00d; // String |
|
||||
|
||||
try {
|
||||
api_instance.removeOfflineFiles(id);
|
||||
} catch (e) {
|
||||
print('Exception when calling LibraryApi->removeOfflineFiles: $e\n');
|
||||
}
|
||||
```
|
||||
|
||||
### Parameters
|
||||
|
||||
Name | Type | Description | Notes
|
||||
------------- | ------------- | ------------- | -------------
|
||||
**id** | **String**| |
|
||||
|
||||
### Return type
|
||||
|
||||
void (empty response body)
|
||||
|
||||
### Authorization
|
||||
|
||||
[cookie](../README.md#cookie), [api_key](../README.md#api_key), [bearer](../README.md#bearer)
|
||||
|
||||
### HTTP request headers
|
||||
|
||||
- **Content-Type**: Not defined
|
||||
- **Accept**: Not defined
|
||||
|
||||
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
|
||||
|
||||
# **scanLibrary**
|
||||
> scanLibrary(id, scanLibraryDto)
|
||||
|
||||
|
||||
|
||||
### Example
|
||||
```dart
|
||||
import 'package:openapi/api.dart';
|
||||
// TODO Configure API key authorization: cookie
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('cookie').apiKey = 'YOUR_API_KEY';
|
||||
// uncomment below to setup prefix (e.g. Bearer) for API key, if needed
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('cookie').apiKeyPrefix = 'Bearer';
|
||||
// TODO Configure API key authorization: api_key
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('api_key').apiKey = 'YOUR_API_KEY';
|
||||
// uncomment below to setup prefix (e.g. Bearer) for API key, if needed
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('api_key').apiKeyPrefix = 'Bearer';
|
||||
// TODO Configure HTTP Bearer authorization: bearer
|
||||
// Case 1. Use String Token
|
||||
//defaultApiClient.getAuthentication<HttpBearerAuth>('bearer').setAccessToken('YOUR_ACCESS_TOKEN');
|
||||
// Case 2. Use Function which generate token.
|
||||
// String yourTokenGeneratorFunction() { ... }
|
||||
//defaultApiClient.getAuthentication<HttpBearerAuth>('bearer').setAccessToken(yourTokenGeneratorFunction);
|
||||
|
||||
final api_instance = LibraryApi();
|
||||
final id = 38400000-8cf0-11bd-b23e-10b96e4ef00d; // String |
|
||||
final scanLibraryDto = ScanLibraryDto(); // ScanLibraryDto |
|
||||
|
||||
try {
|
||||
api_instance.scanLibrary(id, scanLibraryDto);
|
||||
} catch (e) {
|
||||
print('Exception when calling LibraryApi->scanLibrary: $e\n');
|
||||
}
|
||||
```
|
||||
|
||||
### Parameters
|
||||
|
||||
Name | Type | Description | Notes
|
||||
------------- | ------------- | ------------- | -------------
|
||||
**id** | **String**| |
|
||||
**scanLibraryDto** | [**ScanLibraryDto**](ScanLibraryDto.md)| |
|
||||
|
||||
### Return type
|
||||
|
||||
void (empty response body)
|
||||
|
||||
### Authorization
|
||||
|
||||
[cookie](../README.md#cookie), [api_key](../README.md#api_key), [bearer](../README.md#bearer)
|
||||
|
||||
### HTTP request headers
|
||||
|
||||
- **Content-Type**: application/json
|
||||
- **Accept**: Not defined
|
||||
|
||||
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
|
||||
|
||||
# **updateLibrary**
|
||||
> LibraryResponseDto updateLibrary(id, updateLibraryDto)
|
||||
|
||||
|
||||
|
||||
### Example
|
||||
```dart
|
||||
import 'package:openapi/api.dart';
|
||||
// TODO Configure API key authorization: cookie
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('cookie').apiKey = 'YOUR_API_KEY';
|
||||
// uncomment below to setup prefix (e.g. Bearer) for API key, if needed
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('cookie').apiKeyPrefix = 'Bearer';
|
||||
// TODO Configure API key authorization: api_key
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('api_key').apiKey = 'YOUR_API_KEY';
|
||||
// uncomment below to setup prefix (e.g. Bearer) for API key, if needed
|
||||
//defaultApiClient.getAuthentication<ApiKeyAuth>('api_key').apiKeyPrefix = 'Bearer';
|
||||
// TODO Configure HTTP Bearer authorization: bearer
|
||||
// Case 1. Use String Token
|
||||
//defaultApiClient.getAuthentication<HttpBearerAuth>('bearer').setAccessToken('YOUR_ACCESS_TOKEN');
|
||||
// Case 2. Use Function which generate token.
|
||||
// String yourTokenGeneratorFunction() { ... }
|
||||
//defaultApiClient.getAuthentication<HttpBearerAuth>('bearer').setAccessToken(yourTokenGeneratorFunction);
|
||||
|
||||
final api_instance = LibraryApi();
|
||||
final id = 38400000-8cf0-11bd-b23e-10b96e4ef00d; // String |
|
||||
final updateLibraryDto = UpdateLibraryDto(); // UpdateLibraryDto |
|
||||
|
||||
try {
|
||||
final result = api_instance.updateLibrary(id, updateLibraryDto);
|
||||
print(result);
|
||||
} catch (e) {
|
||||
print('Exception when calling LibraryApi->updateLibrary: $e\n');
|
||||
}
|
||||
```
|
||||
|
||||
### Parameters
|
||||
|
||||
Name | Type | Description | Notes
|
||||
------------- | ------------- | ------------- | -------------
|
||||
**id** | **String**| |
|
||||
**updateLibraryDto** | [**UpdateLibraryDto**](UpdateLibraryDto.md)| |
|
||||
|
||||
### Return type
|
||||
|
||||
[**LibraryResponseDto**](LibraryResponseDto.md)
|
||||
|
||||
### Authorization
|
||||
|
||||
[cookie](../README.md#cookie), [api_key](../README.md#api_key), [bearer](../README.md#bearer)
|
||||
|
||||
### HTTP request headers
|
||||
|
||||
- **Content-Type**: application/json
|
||||
- **Accept**: application/json
|
||||
|
||||
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
|
||||
|
||||

@@ -0,0 +1,24 @@

# openapi.model.LibraryResponseDto

## Load the model package
```dart
import 'package:openapi/api.dart';
```

## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**assetCount** | **int** | |
**createdAt** | [**DateTime**](DateTime.md) | |
**exclusionPatterns** | **List<String>** | | [default to const []]
**id** | **String** | |
**importPaths** | **List<String>** | | [default to const []]
**name** | **String** | |
**ownerId** | **String** | |
**refreshedAt** | [**DateTime**](DateTime.md) | |
**type** | [**LibraryType**](LibraryType.md) | |
**updatedAt** | [**DateTime**](DateTime.md) | |

[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)

@@ -0,0 +1,18 @@

# openapi.model.LibraryStatsResponseDto

## Load the model package
```dart
import 'package:openapi/api.dart';
```

## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**photos** | **int** | | [default to 0]
**total** | **int** | | [default to 0]
**usage** | **int** | | [default to 0]
**videos** | **int** | | [default to 0]

[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)

@@ -0,0 +1,14 @@

# openapi.model.LibraryType

## Load the model package
```dart
import 'package:openapi/api.dart';
```

## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------

[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)

@@ -0,0 +1,16 @@

# openapi.model.ScanLibraryDto

## Load the model package
```dart
import 'package:openapi/api.dart';
```

## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**refreshAllFiles** | **bool** | | [optional] [default to false]
**refreshModifiedFiles** | **bool** | | [optional]

[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)

@@ -0,0 +1,18 @@

# openapi.model.UpdateLibraryDto

## Load the model package
```dart
import 'package:openapi/api.dart';
```

## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**exclusionPatterns** | **List<String>** | | [optional] [default to const []]
**importPaths** | **List<String>** | | [optional] [default to const []]
**isVisible** | **bool** | | [optional]
**name** | **String** | | [optional]

[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)

@ -0,0 +1,381 @@
|
||||
//
|
||||
// AUTO-GENERATED FILE, DO NOT MODIFY!
|
||||
//
|
||||
// @dart=2.12
|
||||
|
||||
// ignore_for_file: unused_element, unused_import
|
||||
// ignore_for_file: always_put_required_named_parameters_first
|
||||
// ignore_for_file: constant_identifier_names
|
||||
// ignore_for_file: lines_longer_than_80_chars
|
||||
|
||||
part of openapi.api;
|
||||
|
||||
|
||||
class LibraryApi {
|
||||
LibraryApi([ApiClient? apiClient]) : apiClient = apiClient ?? defaultApiClient;
|
||||
|
||||
final ApiClient apiClient;
|
||||
|
||||
/// Performs an HTTP 'POST /library' operation and returns the [Response].
|
||||
/// Parameters:
|
||||
///
|
||||
/// * [CreateLibraryDto] createLibraryDto (required):
|
||||
Future<Response> createLibraryWithHttpInfo(CreateLibraryDto createLibraryDto,) async {
|
||||
// ignore: prefer_const_declarations
|
||||
final path = r'/library';
|
||||
|
||||
// ignore: prefer_final_locals
|
||||
Object? postBody = createLibraryDto;
|
||||
|
||||
final queryParams = <QueryParam>[];
|
||||
final headerParams = <String, String>{};
|
||||
final formParams = <String, String>{};
|
||||
|
||||
const contentTypes = <String>['application/json'];
|
||||
|
||||
|
||||
return apiClient.invokeAPI(
|
||||
path,
|
||||
'POST',
|
||||
queryParams,
|
||||
postBody,
|
||||
headerParams,
|
||||
formParams,
|
||||
contentTypes.isEmpty ? null : contentTypes.first,
|
||||
);
|
||||
}
|
||||
|
||||
/// Parameters:
|
||||
///
|
||||
/// * [CreateLibraryDto] createLibraryDto (required):
|
||||
Future<LibraryResponseDto?> createLibrary(CreateLibraryDto createLibraryDto,) async {
|
||||
final response = await createLibraryWithHttpInfo(createLibraryDto,);
|
||||
if (response.statusCode >= HttpStatus.badRequest) {
|
||||
throw ApiException(response.statusCode, await _decodeBodyBytes(response));
|
||||
}
|
||||
// When a remote server returns no body with a status of 204, we shall not decode it.
|
||||
// At the time of writing this, `dart:convert` will throw an "Unexpected end of input"
|
||||
// FormatException when trying to decode an empty string.
|
||||
if (response.body.isNotEmpty && response.statusCode != HttpStatus.noContent) {
|
||||
return await apiClient.deserializeAsync(await _decodeBodyBytes(response), 'LibraryResponseDto',) as LibraryResponseDto;
|
||||
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
/// Performs an HTTP 'DELETE /library/{id}' operation and returns the [Response].
|
||||
/// Parameters:
|
||||
///
|
||||
/// * [String] id (required):
|
||||
Future<Response> deleteLibraryWithHttpInfo(String id,) async {
|
||||
// ignore: prefer_const_declarations
|
||||
final path = r'/library/{id}'
|
||||
.replaceAll('{id}', id);
|
||||
|
||||
// ignore: prefer_final_locals
|
||||
Object? postBody;
|
||||
|
||||
final queryParams = <QueryParam>[];
|
||||
final headerParams = <String, String>{};
|
||||
final formParams = <String, String>{};
|
||||
|
||||
const contentTypes = <String>[];
|
||||
|
||||
|
||||
return apiClient.invokeAPI(
|
||||
path,
|
||||
'DELETE',
|
||||
queryParams,
|
||||
postBody,
|
||||
headerParams,
|
||||
formParams,
|
||||
contentTypes.isEmpty ? null : contentTypes.first,
|
||||
);
|
||||
}
|
||||
|
||||
/// Parameters:
|
||||
///
|
||||
/// * [String] id (required):
|
||||
Future<void> deleteLibrary(String id,) async {
|
||||
final response = await deleteLibraryWithHttpInfo(id,);
|
||||
if (response.statusCode >= HttpStatus.badRequest) {
|
||||
throw ApiException(response.statusCode, await _decodeBodyBytes(response));
|
||||
}
|
||||
}
|
||||
|
||||
/// Performs an HTTP 'GET /library' operation and returns the [Response].
|
||||
Future<Response> getAllForUserWithHttpInfo() async {
|
||||
// ignore: prefer_const_declarations
|
||||
final path = r'/library';
|
||||
|
||||
// ignore: prefer_final_locals
|
||||
Object? postBody;
|
||||
|
||||
final queryParams = <QueryParam>[];
|
||||
final headerParams = <String, String>{};
|
||||
final formParams = <String, String>{};
|
||||
|
||||
const contentTypes = <String>[];
|
||||
|
||||
|
||||
return apiClient.invokeAPI(
|
||||
path,
|
||||
'GET',
|
||||
queryParams,
|
||||
postBody,
|
||||
headerParams,
|
||||
formParams,
|
||||
contentTypes.isEmpty ? null : contentTypes.first,
|
||||
);
|
||||
}
|
||||
|
||||
Future<List<LibraryResponseDto>?> getAllForUser() async {
|
||||
final response = await getAllForUserWithHttpInfo();
|
||||
if (response.statusCode >= HttpStatus.badRequest) {
|
||||
throw ApiException(response.statusCode, await _decodeBodyBytes(response));
|
||||
}
|
||||
// When a remote server returns no body with a status of 204, we shall not decode it.
|
||||
// At the time of writing this, `dart:convert` will throw an "Unexpected end of input"
|
||||
// FormatException when trying to decode an empty string.
|
||||
if (response.body.isNotEmpty && response.statusCode != HttpStatus.noContent) {
|
||||
final responseBody = await _decodeBodyBytes(response);
|
||||
return (await apiClient.deserializeAsync(responseBody, 'List<LibraryResponseDto>') as List)
|
||||
.cast<LibraryResponseDto>()
|
||||
.toList();
|
||||
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
/// Performs an HTTP 'GET /library/{id}' operation and returns the [Response].
|
||||
/// Parameters:
|
||||
///
|
||||
/// * [String] id (required):
|
||||
Future<Response> getLibraryInfoWithHttpInfo(String id,) async {
|
||||
// ignore: prefer_const_declarations
|
||||
final path = r'/library/{id}'
|
||||
.replaceAll('{id}', id);
|
||||
|
||||
// ignore: prefer_final_locals
|
||||
Object? postBody;
|
||||
|
||||
final queryParams = <QueryParam>[];
|
||||
final headerParams = <String, String>{};
|
||||
final formParams = <String, String>{};
|
||||
|
||||
const contentTypes = <String>[];
|
||||
|
||||
|
||||
return apiClient.invokeAPI(
|
||||
path,
|
||||
'GET',
|
||||
queryParams,
|
||||
postBody,
|
||||
headerParams,
|
||||
formParams,
|
||||
contentTypes.isEmpty ? null : contentTypes.first,
|
||||
);
|
||||
}
|
||||
|
||||
/// Parameters:
|
||||
///
|
||||
/// * [String] id (required):
|
||||
Future<LibraryResponseDto?> getLibraryInfo(String id,) async {
|
||||
final response = await getLibraryInfoWithHttpInfo(id,);
|
||||
if (response.statusCode >= HttpStatus.badRequest) {
|
||||
throw ApiException(response.statusCode, await _decodeBodyBytes(response));
|
||||
}
|
||||
// When a remote server returns no body with a status of 204, we shall not decode it.
|
||||
// At the time of writing this, `dart:convert` will throw an "Unexpected end of input"
|
||||
// FormatException when trying to decode an empty string.
|
||||
if (response.body.isNotEmpty && response.statusCode != HttpStatus.noContent) {
|
||||
return await apiClient.deserializeAsync(await _decodeBodyBytes(response), 'LibraryResponseDto',) as LibraryResponseDto;
|
||||
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
/// Performs an HTTP 'GET /library/{id}/statistics' operation and returns the [Response].
|
||||
/// Parameters:
|
||||
///
|
||||
/// * [String] id (required):
|
||||
Future<Response> getLibraryStatisticsWithHttpInfo(String id,) async {
|
||||
// ignore: prefer_const_declarations
|
||||
final path = r'/library/{id}/statistics'
|
||||
.replaceAll('{id}', id);
|
||||
|
||||
// ignore: prefer_final_locals
|
||||
Object? postBody;
|
||||
|
||||
final queryParams = <QueryParam>[];
|
||||
final headerParams = <String, String>{};
|
||||
final formParams = <String, String>{};
|
||||
|
||||
const contentTypes = <String>[];
|
||||
|
||||
|
||||
return apiClient.invokeAPI(
|
||||
path,
|
||||
'GET',
|
||||
queryParams,
|
||||
postBody,
|
||||
headerParams,
|
||||
formParams,
|
||||
contentTypes.isEmpty ? null : contentTypes.first,
|
||||
);
|
||||
}
|
||||
|
||||
/// Parameters:
|
||||
///
|
||||
/// * [String] id (required):
|
||||
Future<LibraryStatsResponseDto?> getLibraryStatistics(String id,) async {
|
||||
final response = await getLibraryStatisticsWithHttpInfo(id,);
|
||||
if (response.statusCode >= HttpStatus.badRequest) {
|
||||
throw ApiException(response.statusCode, await _decodeBodyBytes(response));
|
||||
}
|
||||
// When a remote server returns no body with a status of 204, we shall not decode it.
|
||||
// At the time of writing this, `dart:convert` will throw an "Unexpected end of input"
|
||||
// FormatException when trying to decode an empty string.
|
||||
if (response.body.isNotEmpty && response.statusCode != HttpStatus.noContent) {
|
||||
return await apiClient.deserializeAsync(await _decodeBodyBytes(response), 'LibraryStatsResponseDto',) as LibraryStatsResponseDto;
|
||||
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
/// Performs an HTTP 'POST /library/{id}/removeOffline' operation and returns the [Response].
|
||||
/// Parameters:
|
||||
///
|
||||
/// * [String] id (required):
|
||||
Future<Response> removeOfflineFilesWithHttpInfo(String id,) async {
|
||||
// ignore: prefer_const_declarations
|
||||
final path = r'/library/{id}/removeOffline'
|
||||
.replaceAll('{id}', id);
|
||||
|
||||
// ignore: prefer_final_locals
|
||||
Object? postBody;
|
||||
|
||||
final queryParams = <QueryParam>[];
|
||||
final headerParams = <String, String>{};
|
||||
final formParams = <String, String>{};
|
||||
|
||||
const contentTypes = <String>[];
|
||||
|
||||
|
||||
return apiClient.invokeAPI(
|
||||
path,
|
||||
'POST',
|
||||
queryParams,
|
||||
postBody,
|
||||
headerParams,
|
||||
formParams,
|
||||
contentTypes.isEmpty ? null : contentTypes.first,
|
||||
);
|
||||
}
|
||||
|
||||
/// Parameters:
|
||||
///
|
||||
/// * [String] id (required):
|
||||
Future<void> removeOfflineFiles(String id,) async {
|
||||
final response = await removeOfflineFilesWithHttpInfo(id,);
|
||||
if (response.statusCode >= HttpStatus.badRequest) {
|
||||
throw ApiException(response.statusCode, await _decodeBodyBytes(response));
|
||||
}
|
||||
}
|
||||
|
||||
/// Performs an HTTP 'POST /library/{id}/scan' operation and returns the [Response].
|
||||
/// Parameters:
|
||||
///
|
||||
/// * [String] id (required):
|
||||
///
|
||||
/// * [ScanLibraryDto] scanLibraryDto (required):
|
||||
Future<Response> scanLibraryWithHttpInfo(String id, ScanLibraryDto scanLibraryDto,) async {
|
||||
// ignore: prefer_const_declarations
|
||||
final path = r'/library/{id}/scan'
|
||||
.replaceAll('{id}', id);
|
||||
|
||||
// ignore: prefer_final_locals
|
||||
Object? postBody = scanLibraryDto;
|
||||
|
||||
final queryParams = <QueryParam>[];
|
||||
final headerParams = <String, String>{};
|
||||
final formParams = <String, String>{};
|
||||
|
||||
const contentTypes = <String>['application/json'];
|
||||
|
||||
|
||||
return apiClient.invokeAPI(
|
||||
path,
|
||||
'POST',
|
||||
queryParams,
|
||||
postBody,
|
||||
headerParams,
|
||||
formParams,
|
||||
contentTypes.isEmpty ? null : contentTypes.first,
|
||||
);
|
||||
}
|
||||
|
||||
/// Parameters:
|
||||
///
|
||||
/// * [String] id (required):
|
||||
///
|
||||
/// * [ScanLibraryDto] scanLibraryDto (required):
|
||||
Future<void> scanLibrary(String id, ScanLibraryDto scanLibraryDto,) async {
|
||||
final response = await scanLibraryWithHttpInfo(id, scanLibraryDto,);
|
||||
if (response.statusCode >= HttpStatus.badRequest) {
|
||||
throw ApiException(response.statusCode, await _decodeBodyBytes(response));
|
||||
}
|
||||
}
|
||||
|
||||
/// Performs an HTTP 'PUT /library/{id}' operation and returns the [Response].
|
||||
/// Parameters:
|
||||
///
|
||||
/// * [String] id (required):
|
||||
///
|
||||
/// * [UpdateLibraryDto] updateLibraryDto (required):
|
||||
Future<Response> updateLibraryWithHttpInfo(String id, UpdateLibraryDto updateLibraryDto,) async {
|
||||
// ignore: prefer_const_declarations
|
||||
final path = r'/library/{id}'
|
||||
.replaceAll('{id}', id);
|
||||
|
||||
// ignore: prefer_final_locals
|
||||
Object? postBody = updateLibraryDto;
|
||||
|
||||
final queryParams = <QueryParam>[];
|
||||
final headerParams = <String, String>{};
|
||||
final formParams = <String, String>{};
|
||||
|
||||
const contentTypes = <String>['application/json'];
|
||||
|
||||
|
||||
return apiClient.invokeAPI(
|
||||
path,
|
||||
'PUT',
|
||||
queryParams,
|
||||
postBody,
|
||||
headerParams,
|
||||
formParams,
|
||||
contentTypes.isEmpty ? null : contentTypes.first,
|
||||
);
|
||||
}
|
||||
|
||||
/// Parameters:
|
||||
///
|
||||
/// * [String] id (required):
|
||||
///
|
||||
/// * [UpdateLibraryDto] updateLibraryDto (required):
|
||||
Future<LibraryResponseDto?> updateLibrary(String id, UpdateLibraryDto updateLibraryDto,) async {
|
||||
final response = await updateLibraryWithHttpInfo(id, updateLibraryDto,);
|
||||
if (response.statusCode >= HttpStatus.badRequest) {
|
||||
throw ApiException(response.statusCode, await _decodeBodyBytes(response));
|
||||
}
|
||||
// When a remote server returns no body with a status of 204, we shall not decode it.
|
||||
// At the time of writing this, `dart:convert` will throw an "Unexpected end of input"
|
||||
// FormatException when trying to decode an empty string.
|
||||
if (response.body.isNotEmpty && response.statusCode != HttpStatus.noContent) {
|
||||
return await apiClient.deserializeAsync(await _decodeBodyBytes(response), 'LibraryResponseDto',) as LibraryResponseDto;
|
||||
|
||||
}
|
||||
return null;
|
||||
}
|
||||
}
|
||||
@ -0,0 +1,150 @@
|
||||
//
|
||||
// AUTO-GENERATED FILE, DO NOT MODIFY!
|
||||
//
|
||||
// @dart=2.12
|
||||
|
||||
// ignore_for_file: unused_element, unused_import
|
||||
// ignore_for_file: always_put_required_named_parameters_first
|
||||
// ignore_for_file: constant_identifier_names
|
||||
// ignore_for_file: lines_longer_than_80_chars
|
||||
|
||||
part of openapi.api;
|
||||
|
||||
class CreateLibraryDto {
|
||||
/// Returns a new [CreateLibraryDto] instance.
|
||||
CreateLibraryDto({
|
||||
this.exclusionPatterns = const [],
|
||||
this.importPaths = const [],
|
||||
this.isVisible,
|
||||
this.name,
|
||||
required this.type,
|
||||
});
|
||||
|
||||
List<String> exclusionPatterns;
|
||||
|
||||
List<String> importPaths;
|
||||
|
||||
///
|
||||
/// Please note: This property should have been non-nullable! Since the specification file
|
||||
/// does not include a default value (using the "default:" property), however, the generated
|
||||
/// source code must fall back to having a nullable type.
|
||||
/// Consider adding a "default:" property in the specification file to hide this note.
|
||||
///
|
||||
bool? isVisible;
|
||||
|
||||
///
|
||||
/// Please note: This property should have been non-nullable! Since the specification file
|
||||
/// does not include a default value (using the "default:" property), however, the generated
|
||||
/// source code must fall back to having a nullable type.
|
||||
/// Consider adding a "default:" property in the specification file to hide this note.
|
||||
///
|
||||
String? name;
|
||||
|
||||
LibraryType type;
|
||||
|
||||
@override
|
||||
bool operator ==(Object other) => identical(this, other) || other is CreateLibraryDto &&
|
||||
other.exclusionPatterns == exclusionPatterns &&
|
||||
other.importPaths == importPaths &&
|
||||
other.isVisible == isVisible &&
|
||||
other.name == name &&
|
||||
other.type == type;
|
||||
|
||||
@override
|
||||
int get hashCode =>
|
||||
// ignore: unnecessary_parenthesis
|
||||
(exclusionPatterns.hashCode) +
|
||||
(importPaths.hashCode) +
|
||||
(isVisible == null ? 0 : isVisible!.hashCode) +
|
||||
(name == null ? 0 : name!.hashCode) +
|
||||
(type.hashCode);
|
||||
|
||||
@override
|
||||
String toString() => 'CreateLibraryDto[exclusionPatterns=$exclusionPatterns, importPaths=$importPaths, isVisible=$isVisible, name=$name, type=$type]';
|
||||
|
||||
Map<String, dynamic> toJson() {
|
||||
final json = <String, dynamic>{};
|
||||
json[r'exclusionPatterns'] = this.exclusionPatterns;
|
||||
json[r'importPaths'] = this.importPaths;
|
||||
if (this.isVisible != null) {
|
||||
json[r'isVisible'] = this.isVisible;
|
||||
} else {
|
||||
// json[r'isVisible'] = null;
|
||||
}
|
||||
if (this.name != null) {
|
||||
json[r'name'] = this.name;
|
||||
} else {
|
||||
// json[r'name'] = null;
|
||||
}
|
||||
json[r'type'] = this.type;
|
||||
return json;
|
||||
}
|
||||
|
||||
/// Returns a new [CreateLibraryDto] instance and imports its values from
|
||||
/// [value] if it's a [Map], null otherwise.
|
||||
// ignore: prefer_constructors_over_static_methods
|
||||
static CreateLibraryDto? fromJson(dynamic value) {
|
||||
if (value is Map) {
|
||||
final json = value.cast<String, dynamic>();
|
||||
|
||||
return CreateLibraryDto(
|
||||
exclusionPatterns: json[r'exclusionPatterns'] is List
|
||||
? (json[r'exclusionPatterns'] as List).cast<String>()
|
||||
: const [],
|
||||
importPaths: json[r'importPaths'] is List
|
||||
? (json[r'importPaths'] as List).cast<String>()
|
||||
: const [],
|
||||
isVisible: mapValueOfType<bool>(json, r'isVisible'),
|
||||
name: mapValueOfType<String>(json, r'name'),
|
||||
type: LibraryType.fromJson(json[r'type'])!,
|
||||
);
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
static List<CreateLibraryDto> listFromJson(dynamic json, {bool growable = false,}) {
|
||||
final result = <CreateLibraryDto>[];
|
||||
if (json is List && json.isNotEmpty) {
|
||||
for (final row in json) {
|
||||
final value = CreateLibraryDto.fromJson(row);
|
||||
if (value != null) {
|
||||
result.add(value);
|
||||
}
|
||||
}
|
||||
}
|
||||
return result.toList(growable: growable);
|
||||
}
|
||||
|
||||
static Map<String, CreateLibraryDto> mapFromJson(dynamic json) {
|
||||
final map = <String, CreateLibraryDto>{};
|
||||
if (json is Map && json.isNotEmpty) {
|
||||
json = json.cast<String, dynamic>(); // ignore: parameter_assignments
|
||||
for (final entry in json.entries) {
|
||||
final value = CreateLibraryDto.fromJson(entry.value);
|
||||
if (value != null) {
|
||||
map[entry.key] = value;
|
||||
}
|
||||
}
|
||||
}
|
||||
return map;
|
||||
}
|
||||
|
||||
// maps a json object with a list of CreateLibraryDto-objects as value to a dart map
|
||||
static Map<String, List<CreateLibraryDto>> mapListFromJson(dynamic json, {bool growable = false,}) {
|
||||
final map = <String, List<CreateLibraryDto>>{};
|
||||
if (json is Map && json.isNotEmpty) {
|
||||
// ignore: parameter_assignments
|
||||
json = json.cast<String, dynamic>();
|
||||
for (final entry in json.entries) {
|
||||
map[entry.key] = CreateLibraryDto.listFromJson(entry.value, growable: growable,);
|
||||
}
|
||||
}
|
||||
return map;
|
||||
}
|
||||
|
||||
/// The list of required keys that must be present in a JSON.
|
||||
static const requiredKeys = <String>{
|
||||
'type',
|
||||
};
|
||||
}
|
||||
|
||||
@ -0,0 +1,178 @@
|
||||
//
|
||||
// AUTO-GENERATED FILE, DO NOT MODIFY!
|
||||
//
|
||||
// @dart=2.12
|
||||
|
||||
// ignore_for_file: unused_element, unused_import
|
||||
// ignore_for_file: always_put_required_named_parameters_first
|
||||
// ignore_for_file: constant_identifier_names
|
||||
// ignore_for_file: lines_longer_than_80_chars
|
||||
|
||||
part of openapi.api;
|
||||
|
||||
class LibraryResponseDto {
|
||||
/// Returns a new [LibraryResponseDto] instance.
|
||||
LibraryResponseDto({
|
||||
required this.assetCount,
|
||||
required this.createdAt,
|
||||
this.exclusionPatterns = const [],
|
||||
required this.id,
|
||||
this.importPaths = const [],
|
||||
required this.name,
|
||||
required this.ownerId,
|
||||
required this.refreshedAt,
|
||||
required this.type,
|
||||
required this.updatedAt,
|
||||
});
|
||||
|
||||
int assetCount;
|
||||
|
||||
DateTime createdAt;
|
||||
|
||||
List<String> exclusionPatterns;
|
||||
|
||||
String id;
|
||||
|
||||
List<String> importPaths;
|
||||
|
||||
String name;
|
||||
|
||||
String ownerId;
|
||||
|
||||
DateTime? refreshedAt;
|
||||
|
||||
LibraryType type;
|
||||
|
||||
DateTime updatedAt;
|
||||
|
||||
@override
|
||||
bool operator ==(Object other) => identical(this, other) || other is LibraryResponseDto &&
|
||||
other.assetCount == assetCount &&
|
||||
other.createdAt == createdAt &&
|
||||
other.exclusionPatterns == exclusionPatterns &&
|
||||
other.id == id &&
|
||||
other.importPaths == importPaths &&
|
||||
other.name == name &&
|
||||
other.ownerId == ownerId &&
|
||||
other.refreshedAt == refreshedAt &&
|
||||
other.type == type &&
|
||||
other.updatedAt == updatedAt;
|
||||
|
||||
@override
|
||||
int get hashCode =>
|
||||
// ignore: unnecessary_parenthesis
|
||||
(assetCount.hashCode) +
|
||||
(createdAt.hashCode) +
|
||||
(exclusionPatterns.hashCode) +
|
||||
(id.hashCode) +
|
||||
(importPaths.hashCode) +
|
||||
(name.hashCode) +
|
||||
(ownerId.hashCode) +
|
||||
(refreshedAt == null ? 0 : refreshedAt!.hashCode) +
|
||||
(type.hashCode) +
|
||||
(updatedAt.hashCode);
|
||||
|
||||
@override
|
||||
String toString() => 'LibraryResponseDto[assetCount=$assetCount, createdAt=$createdAt, exclusionPatterns=$exclusionPatterns, id=$id, importPaths=$importPaths, name=$name, ownerId=$ownerId, refreshedAt=$refreshedAt, type=$type, updatedAt=$updatedAt]';
|
||||
|
||||
Map<String, dynamic> toJson() {
|
||||
final json = <String, dynamic>{};
|
||||
json[r'assetCount'] = this.assetCount;
|
||||
json[r'createdAt'] = this.createdAt.toUtc().toIso8601String();
|
||||
json[r'exclusionPatterns'] = this.exclusionPatterns;
|
||||
json[r'id'] = this.id;
|
||||
json[r'importPaths'] = this.importPaths;
|
||||
json[r'name'] = this.name;
|
||||
json[r'ownerId'] = this.ownerId;
|
||||
if (this.refreshedAt != null) {
|
||||
json[r'refreshedAt'] = this.refreshedAt!.toUtc().toIso8601String();
|
||||
} else {
|
||||
// json[r'refreshedAt'] = null;
|
||||
}
|
||||
json[r'type'] = this.type;
|
||||
json[r'updatedAt'] = this.updatedAt.toUtc().toIso8601String();
|
||||
return json;
|
||||
}
|
||||
|
||||
/// Returns a new [LibraryResponseDto] instance and imports its values from
|
||||
/// [value] if it's a [Map], null otherwise.
|
||||
// ignore: prefer_constructors_over_static_methods
|
||||
static LibraryResponseDto? fromJson(dynamic value) {
|
||||
if (value is Map) {
|
||||
final json = value.cast<String, dynamic>();
|
||||
|
||||
return LibraryResponseDto(
|
||||
assetCount: mapValueOfType<int>(json, r'assetCount')!,
|
||||
createdAt: mapDateTime(json, r'createdAt', '')!,
|
||||
exclusionPatterns: json[r'exclusionPatterns'] is List
|
||||
? (json[r'exclusionPatterns'] as List).cast<String>()
|
||||
: const [],
|
||||
id: mapValueOfType<String>(json, r'id')!,
|
||||
importPaths: json[r'importPaths'] is List
|
||||
? (json[r'importPaths'] as List).cast<String>()
|
||||
: const [],
|
||||
name: mapValueOfType<String>(json, r'name')!,
|
||||
ownerId: mapValueOfType<String>(json, r'ownerId')!,
|
||||
refreshedAt: mapDateTime(json, r'refreshedAt', ''),
|
||||
type: LibraryType.fromJson(json[r'type'])!,
|
||||
updatedAt: mapDateTime(json, r'updatedAt', '')!,
|
||||
);
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
static List<LibraryResponseDto> listFromJson(dynamic json, {bool growable = false,}) {
|
||||
final result = <LibraryResponseDto>[];
|
||||
if (json is List && json.isNotEmpty) {
|
||||
for (final row in json) {
|
||||
final value = LibraryResponseDto.fromJson(row);
|
||||
if (value != null) {
|
||||
result.add(value);
|
||||
}
|
||||
}
|
||||
}
|
||||
return result.toList(growable: growable);
|
||||
}
|
||||
|
||||
static Map<String, LibraryResponseDto> mapFromJson(dynamic json) {
|
||||
final map = <String, LibraryResponseDto>{};
|
||||
if (json is Map && json.isNotEmpty) {
|
||||
json = json.cast<String, dynamic>(); // ignore: parameter_assignments
|
||||
for (final entry in json.entries) {
|
||||
final value = LibraryResponseDto.fromJson(entry.value);
|
||||
if (value != null) {
|
||||
map[entry.key] = value;
|
||||
}
|
||||
}
|
||||
}
|
||||
return map;
|
||||
}
|
||||
|
||||
// maps a json object with a list of LibraryResponseDto-objects as value to a dart map
|
||||
static Map<String, List<LibraryResponseDto>> mapListFromJson(dynamic json, {bool growable = false,}) {
|
||||
final map = <String, List<LibraryResponseDto>>{};
|
||||
if (json is Map && json.isNotEmpty) {
|
||||
// ignore: parameter_assignments
|
||||
json = json.cast<String, dynamic>();
|
||||
for (final entry in json.entries) {
|
||||
map[entry.key] = LibraryResponseDto.listFromJson(entry.value, growable: growable,);
|
||||
}
|
||||
}
|
||||
return map;
|
||||
}
|
||||
|
||||
/// The list of required keys that must be present in a JSON.
|
||||
static const requiredKeys = <String>{
|
||||
'assetCount',
|
||||
'createdAt',
|
||||
'exclusionPatterns',
|
||||
'id',
|
||||
'importPaths',
|
||||
'name',
|
||||
'ownerId',
|
||||
'refreshedAt',
|
||||
'type',
|
||||
'updatedAt',
|
||||
};
|
||||
}
|
||||
|
||||
@ -0,0 +1,122 @@
|
||||
//
|
||||
// AUTO-GENERATED FILE, DO NOT MODIFY!
|
||||
//
|
||||
// @dart=2.12
|
||||
|
||||
// ignore_for_file: unused_element, unused_import
|
||||
// ignore_for_file: always_put_required_named_parameters_first
|
||||
// ignore_for_file: constant_identifier_names
|
||||
// ignore_for_file: lines_longer_than_80_chars
|
||||
|
||||
part of openapi.api;
|
||||
|
||||
class LibraryStatsResponseDto {
|
||||
/// Returns a new [LibraryStatsResponseDto] instance.
|
||||
LibraryStatsResponseDto({
|
||||
this.photos = 0,
|
||||
this.total = 0,
|
||||
this.usage = 0,
|
||||
this.videos = 0,
|
||||
});
|
||||
|
||||
int photos;
|
||||
|
||||
int total;
|
||||
|
||||
int usage;
|
||||
|
||||
int videos;
|
||||
|
||||
@override
|
||||
bool operator ==(Object other) => identical(this, other) || other is LibraryStatsResponseDto &&
|
||||
other.photos == photos &&
|
||||
other.total == total &&
|
||||
other.usage == usage &&
|
||||
other.videos == videos;
|
||||
|
||||
@override
|
||||
int get hashCode =>
|
||||
// ignore: unnecessary_parenthesis
|
||||
(photos.hashCode) +
|
||||
(total.hashCode) +
|
||||
(usage.hashCode) +
|
||||
(videos.hashCode);
|
||||
|
||||
@override
|
||||
String toString() => 'LibraryStatsResponseDto[photos=$photos, total=$total, usage=$usage, videos=$videos]';
|
||||
|
||||
Map<String, dynamic> toJson() {
|
||||
final json = <String, dynamic>{};
|
||||
json[r'photos'] = this.photos;
|
||||
json[r'total'] = this.total;
|
||||
json[r'usage'] = this.usage;
|
||||
json[r'videos'] = this.videos;
|
||||
return json;
|
||||
}
|
||||
|
||||
/// Returns a new [LibraryStatsResponseDto] instance and imports its values from
|
||||
/// [value] if it's a [Map], null otherwise.
|
||||
// ignore: prefer_constructors_over_static_methods
|
||||
static LibraryStatsResponseDto? fromJson(dynamic value) {
|
||||
if (value is Map) {
|
||||
final json = value.cast<String, dynamic>();
|
||||
|
||||
return LibraryStatsResponseDto(
|
||||
photos: mapValueOfType<int>(json, r'photos')!,
|
||||
total: mapValueOfType<int>(json, r'total')!,
|
||||
usage: mapValueOfType<int>(json, r'usage')!,
|
||||
videos: mapValueOfType<int>(json, r'videos')!,
|
||||
);
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
static List<LibraryStatsResponseDto> listFromJson(dynamic json, {bool growable = false,}) {
|
||||
final result = <LibraryStatsResponseDto>[];
|
||||
if (json is List && json.isNotEmpty) {
|
||||
for (final row in json) {
|
||||
final value = LibraryStatsResponseDto.fromJson(row);
|
||||
if (value != null) {
|
||||
result.add(value);
|
||||
}
|
||||
}
|
||||
}
|
||||
return result.toList(growable: growable);
|
||||
}
|
||||
|
||||
static Map<String, LibraryStatsResponseDto> mapFromJson(dynamic json) {
|
||||
final map = <String, LibraryStatsResponseDto>{};
|
||||
if (json is Map && json.isNotEmpty) {
|
||||
json = json.cast<String, dynamic>(); // ignore: parameter_assignments
|
||||
for (final entry in json.entries) {
|
||||
final value = LibraryStatsResponseDto.fromJson(entry.value);
|
||||
if (value != null) {
|
||||
map[entry.key] = value;
|
||||
}
|
||||
}
|
||||
}
|
||||
return map;
|
||||
}
|
||||
|
||||
// maps a json object with a list of LibraryStatsResponseDto-objects as value to a dart map
|
||||
static Map<String, List<LibraryStatsResponseDto>> mapListFromJson(dynamic json, {bool growable = false,}) {
|
||||
final map = <String, List<LibraryStatsResponseDto>>{};
|
||||
if (json is Map && json.isNotEmpty) {
|
||||
// ignore: parameter_assignments
|
||||
json = json.cast<String, dynamic>();
|
||||
for (final entry in json.entries) {
|
||||
map[entry.key] = LibraryStatsResponseDto.listFromJson(entry.value, growable: growable,);
|
||||
}
|
||||
}
|
||||
return map;
|
||||
}
|
||||
|
||||
/// The list of required keys that must be present in a JSON.
|
||||
static const requiredKeys = <String>{
|
||||
'photos',
|
||||
'total',
|
||||
'usage',
|
||||
'videos',
|
||||
};
|
||||
}
|
||||
|
||||
@ -0,0 +1,85 @@
|
||||
//
|
||||
// AUTO-GENERATED FILE, DO NOT MODIFY!
|
||||
//
|
||||
// @dart=2.12
|
||||
|
||||
// ignore_for_file: unused_element, unused_import
|
||||
// ignore_for_file: always_put_required_named_parameters_first
|
||||
// ignore_for_file: constant_identifier_names
|
||||
// ignore_for_file: lines_longer_than_80_chars
|
||||
|
||||
part of openapi.api;
|
||||
|
||||
|
||||
class LibraryType {
|
||||
/// Instantiate a new enum with the provided [value].
|
||||
const LibraryType._(this.value);
|
||||
|
||||
/// The underlying value of this enum member.
|
||||
final String value;
|
||||
|
||||
@override
|
||||
String toString() => value;
|
||||
|
||||
String toJson() => value;
|
||||
|
||||
static const UPLOAD = LibraryType._(r'UPLOAD');
|
||||
static const EXTERNAL = LibraryType._(r'EXTERNAL');
|
||||
|
||||
/// List of all possible values in this [enum][LibraryType].
|
||||
static const values = <LibraryType>[
|
||||
UPLOAD,
|
||||
EXTERNAL,
|
||||
];
|
||||
|
||||
static LibraryType? fromJson(dynamic value) => LibraryTypeTypeTransformer().decode(value);
|
||||
|
||||
static List<LibraryType>? listFromJson(dynamic json, {bool growable = false,}) {
|
||||
final result = <LibraryType>[];
|
||||
if (json is List && json.isNotEmpty) {
|
||||
for (final row in json) {
|
||||
final value = LibraryType.fromJson(row);
|
||||
if (value != null) {
|
||||
result.add(value);
|
||||
}
|
||||
}
|
||||
}
|
||||
return result.toList(growable: growable);
|
||||
}
|
||||
}
|
||||
|
||||
/// Transformation class that can [encode] an instance of [LibraryType] to String,
|
||||
/// and [decode] dynamic data back to [LibraryType].
|
||||
class LibraryTypeTypeTransformer {
|
||||
factory LibraryTypeTypeTransformer() => _instance ??= const LibraryTypeTypeTransformer._();
|
||||
|
||||
const LibraryTypeTypeTransformer._();
|
||||
|
||||
String encode(LibraryType data) => data.value;
|
||||
|
||||
/// Decodes a [dynamic value][data] to a LibraryType.
|
||||
///
|
||||
/// If [allowNull] is true and the [dynamic value][data] cannot be decoded successfully,
|
||||
/// then null is returned. However, if [allowNull] is false and the [dynamic value][data]
|
||||
/// cannot be decoded successfully, then an [UnimplementedError] is thrown.
|
||||
///
|
||||
/// The [allowNull] is very handy when an API changes and a new enum value is added or removed,
|
||||
/// and users are still using an old app with the old code.
|
||||
LibraryType? decode(dynamic data, {bool allowNull = true}) {
|
||||
if (data != null) {
|
||||
switch (data) {
|
||||
case r'UPLOAD': return LibraryType.UPLOAD;
|
||||
case r'EXTERNAL': return LibraryType.EXTERNAL;
|
||||
default:
|
||||
if (!allowNull) {
|
||||
throw ArgumentError('Unknown enum value to decode: $data');
|
||||
}
|
||||
}
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
/// Singleton [LibraryTypeTypeTransformer] instance.
|
||||
static LibraryTypeTypeTransformer? _instance;
|
||||
}
|
||||
|
||||
@ -0,0 +1,114 @@
|
||||
//
|
||||
// AUTO-GENERATED FILE, DO NOT MODIFY!
|
||||
//
|
||||
// @dart=2.12
|
||||
|
||||
// ignore_for_file: unused_element, unused_import
|
||||
// ignore_for_file: always_put_required_named_parameters_first
|
||||
// ignore_for_file: constant_identifier_names
|
||||
// ignore_for_file: lines_longer_than_80_chars
|
||||
|
||||
part of openapi.api;
|
||||
|
||||
class ScanLibraryDto {
|
||||
/// Returns a new [ScanLibraryDto] instance.
|
||||
ScanLibraryDto({
|
||||
this.refreshAllFiles = false,
|
||||
this.refreshModifiedFiles,
|
||||
});
|
||||
|
||||
bool refreshAllFiles;
|
||||
|
||||
///
|
||||
/// Please note: This property should have been non-nullable! Since the specification file
|
||||
/// does not include a default value (using the "default:" property), however, the generated
|
||||
/// source code must fall back to having a nullable type.
|
||||
/// Consider adding a "default:" property in the specification file to hide this note.
|
||||
///
|
||||
bool? refreshModifiedFiles;
|
||||
|
||||
@override
|
||||
bool operator ==(Object other) => identical(this, other) || other is ScanLibraryDto &&
|
||||
other.refreshAllFiles == refreshAllFiles &&
|
||||
other.refreshModifiedFiles == refreshModifiedFiles;
|
||||
|
||||
@override
|
||||
int get hashCode =>
|
||||
// ignore: unnecessary_parenthesis
|
||||
(refreshAllFiles.hashCode) +
|
||||
(refreshModifiedFiles == null ? 0 : refreshModifiedFiles!.hashCode);
|
||||
|
||||
@override
|
||||
String toString() => 'ScanLibraryDto[refreshAllFiles=$refreshAllFiles, refreshModifiedFiles=$refreshModifiedFiles]';
|
||||
|
||||
Map<String, dynamic> toJson() {
|
||||
final json = <String, dynamic>{};
|
||||
json[r'refreshAllFiles'] = this.refreshAllFiles;
|
||||
if (this.refreshModifiedFiles != null) {
|
||||
json[r'refreshModifiedFiles'] = this.refreshModifiedFiles;
|
||||
} else {
|
||||
// json[r'refreshModifiedFiles'] = null;
|
||||
}
|
||||
return json;
|
||||
}
|
||||
|
||||
/// Returns a new [ScanLibraryDto] instance and imports its values from
|
||||
/// [value] if it's a [Map], null otherwise.
|
||||
// ignore: prefer_constructors_over_static_methods
|
||||
static ScanLibraryDto? fromJson(dynamic value) {
|
||||
if (value is Map) {
|
||||
final json = value.cast<String, dynamic>();
|
||||
|
||||
return ScanLibraryDto(
|
||||
refreshAllFiles: mapValueOfType<bool>(json, r'refreshAllFiles') ?? false,
|
||||
refreshModifiedFiles: mapValueOfType<bool>(json, r'refreshModifiedFiles'),
|
||||
);
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
static List<ScanLibraryDto> listFromJson(dynamic json, {bool growable = false,}) {
|
||||
final result = <ScanLibraryDto>[];
|
||||
if (json is List && json.isNotEmpty) {
|
||||
for (final row in json) {
|
||||
final value = ScanLibraryDto.fromJson(row);
|
||||
if (value != null) {
|
||||
result.add(value);
|
||||
}
|
||||
}
|
||||
}
|
||||
return result.toList(growable: growable);
|
||||
}
|
||||
|
||||
static Map<String, ScanLibraryDto> mapFromJson(dynamic json) {
|
||||
final map = <String, ScanLibraryDto>{};
|
||||
if (json is Map && json.isNotEmpty) {
|
||||
json = json.cast<String, dynamic>(); // ignore: parameter_assignments
|
||||
for (final entry in json.entries) {
|
||||
final value = ScanLibraryDto.fromJson(entry.value);
|
||||
if (value != null) {
|
||||
map[entry.key] = value;
|
||||
}
|
||||
}
|
||||
}
|
||||
return map;
|
||||
}
|
||||
|
||||
// maps a json object with a list of ScanLibraryDto-objects as value to a dart map
|
||||
static Map<String, List<ScanLibraryDto>> mapListFromJson(dynamic json, {bool growable = false,}) {
|
||||
final map = <String, List<ScanLibraryDto>>{};
|
||||
if (json is Map && json.isNotEmpty) {
|
||||
// ignore: parameter_assignments
|
||||
json = json.cast<String, dynamic>();
|
||||
for (final entry in json.entries) {
|
||||
map[entry.key] = ScanLibraryDto.listFromJson(entry.value, growable: growable,);
|
||||
}
|
||||
}
|
||||
return map;
|
||||
}
|
||||
|
||||
/// The list of required keys that must be present in a JSON.
|
||||
static const requiredKeys = <String>{
|
||||
};
|
||||
}
|
||||
|
||||
@ -0,0 +1,142 @@
|
||||
//
|
||||
// AUTO-GENERATED FILE, DO NOT MODIFY!
|
||||
//
|
||||
// @dart=2.12
|
||||
|
||||
// ignore_for_file: unused_element, unused_import
|
||||
// ignore_for_file: always_put_required_named_parameters_first
|
||||
// ignore_for_file: constant_identifier_names
|
||||
// ignore_for_file: lines_longer_than_80_chars
|
||||
|
||||
part of openapi.api;
|
||||
|
||||
class UpdateLibraryDto {
|
||||
/// Returns a new [UpdateLibraryDto] instance.
|
||||
UpdateLibraryDto({
|
||||
this.exclusionPatterns = const [],
|
||||
this.importPaths = const [],
|
||||
this.isVisible,
|
||||
this.name,
|
||||
});
|
||||
|
||||
List<String> exclusionPatterns;
|
||||
|
||||
List<String> importPaths;
|
||||
|
||||
///
|
||||
/// Please note: This property should have been non-nullable! Since the specification file
|
||||
/// does not include a default value (using the "default:" property), however, the generated
|
||||
/// source code must fall back to having a nullable type.
|
||||
/// Consider adding a "default:" property in the specification file to hide this note.
|
||||
///
|
||||
bool? isVisible;
|
||||
|
||||
///
|
||||
/// Please note: This property should have been non-nullable! Since the specification file
|
||||
/// does not include a default value (using the "default:" property), however, the generated
|
||||
/// source code must fall back to having a nullable type.
|
||||
/// Consider adding a "default:" property in the specification file to hide this note.
|
||||
///
|
||||
String? name;
|
||||
|
||||
@override
|
||||
bool operator ==(Object other) => identical(this, other) || other is UpdateLibraryDto &&
|
||||
other.exclusionPatterns == exclusionPatterns &&
|
||||
other.importPaths == importPaths &&
|
||||
other.isVisible == isVisible &&
|
||||
other.name == name;
|
||||
|
||||
@override
|
||||
int get hashCode =>
|
||||
// ignore: unnecessary_parenthesis
|
||||
(exclusionPatterns.hashCode) +
|
||||
(importPaths.hashCode) +
|
||||
(isVisible == null ? 0 : isVisible!.hashCode) +
|
||||
(name == null ? 0 : name!.hashCode);
|
||||
|
||||
@override
|
||||
String toString() => 'UpdateLibraryDto[exclusionPatterns=$exclusionPatterns, importPaths=$importPaths, isVisible=$isVisible, name=$name]';
|
||||
|
||||
Map<String, dynamic> toJson() {
|
||||
final json = <String, dynamic>{};
|
||||
json[r'exclusionPatterns'] = this.exclusionPatterns;
|
||||
json[r'importPaths'] = this.importPaths;
|
||||
if (this.isVisible != null) {
|
||||
json[r'isVisible'] = this.isVisible;
|
||||
} else {
|
||||
// json[r'isVisible'] = null;
|
||||
}
|
||||
if (this.name != null) {
|
||||
json[r'name'] = this.name;
|
||||
} else {
|
||||
// json[r'name'] = null;
|
||||
}
|
||||
return json;
|
||||
}
|
||||
|
||||
/// Returns a new [UpdateLibraryDto] instance and imports its values from
|
||||
/// [value] if it's a [Map], null otherwise.
|
||||
// ignore: prefer_constructors_over_static_methods
|
||||
static UpdateLibraryDto? fromJson(dynamic value) {
|
||||
if (value is Map) {
|
||||
final json = value.cast<String, dynamic>();
|
||||
|
||||
return UpdateLibraryDto(
|
||||
exclusionPatterns: json[r'exclusionPatterns'] is List
|
||||
? (json[r'exclusionPatterns'] as List).cast<String>()
|
||||
: const [],
|
||||
importPaths: json[r'importPaths'] is List
|
||||
? (json[r'importPaths'] as List).cast<String>()
|
||||
: const [],
|
||||
isVisible: mapValueOfType<bool>(json, r'isVisible'),
|
||||
name: mapValueOfType<String>(json, r'name'),
|
||||
);
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
static List<UpdateLibraryDto> listFromJson(dynamic json, {bool growable = false,}) {
|
||||
final result = <UpdateLibraryDto>[];
|
||||
if (json is List && json.isNotEmpty) {
|
||||
for (final row in json) {
|
||||
final value = UpdateLibraryDto.fromJson(row);
|
||||
if (value != null) {
|
||||
result.add(value);
|
||||
}
|
||||
}
|
||||
}
|
||||
return result.toList(growable: growable);
|
||||
}
|
||||
|
||||
static Map<String, UpdateLibraryDto> mapFromJson(dynamic json) {
|
||||
final map = <String, UpdateLibraryDto>{};
|
||||
if (json is Map && json.isNotEmpty) {
|
||||
json = json.cast<String, dynamic>(); // ignore: parameter_assignments
|
||||
for (final entry in json.entries) {
|
||||
final value = UpdateLibraryDto.fromJson(entry.value);
|
||||
if (value != null) {
|
||||
map[entry.key] = value;
|
||||
}
|
||||
}
|
||||
}
|
||||
return map;
|
||||
}
|
||||
|
||||
// maps a json object with a list of UpdateLibraryDto-objects as value to a dart map
|
||||
static Map<String, List<UpdateLibraryDto>> mapListFromJson(dynamic json, {bool growable = false,}) {
|
||||
final map = <String, List<UpdateLibraryDto>>{};
|
||||
if (json is Map && json.isNotEmpty) {
|
||||
// ignore: parameter_assignments
|
||||
json = json.cast<String, dynamic>();
|
||||
for (final entry in json.entries) {
|
||||
map[entry.key] = UpdateLibraryDto.listFromJson(entry.value, growable: growable,);
|
||||
}
|
||||
}
|
||||
return map;
|
||||
}
|
||||
|
||||
/// The list of required keys that must be present in a JSON.
|
||||
static const requiredKeys = <String>{
|
||||
};
|
||||
}
|
||||
|
||||
@ -0,0 +1,47 @@
|
||||
//
|
||||
// AUTO-GENERATED FILE, DO NOT MODIFY!
|
||||
//
|
||||
// @dart=2.12
|
||||
|
||||
// ignore_for_file: unused_element, unused_import
|
||||
// ignore_for_file: always_put_required_named_parameters_first
|
||||
// ignore_for_file: constant_identifier_names
|
||||
// ignore_for_file: lines_longer_than_80_chars
|
||||
|
||||
import 'package:openapi/api.dart';
|
||||
import 'package:test/test.dart';
|
||||
|
||||
// tests for CreateLibraryDto
|
||||
void main() {
|
||||
// final instance = CreateLibraryDto();
|
||||
|
||||
group('test CreateLibraryDto', () {
|
||||
// List<String> exclusionPatterns (default value: const [])
|
||||
test('to test the property `exclusionPatterns`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// List<String> importPaths (default value: const [])
|
||||
test('to test the property `importPaths`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// bool isVisible
|
||||
test('to test the property `isVisible`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// String name
|
||||
test('to test the property `name`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// LibraryType type
|
||||
test('to test the property `type`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
|
||||
});
|
||||
|
||||
}
|
||||
@ -0,0 +1,61 @@
|
||||
//
|
||||
// AUTO-GENERATED FILE, DO NOT MODIFY!
|
||||
//
|
||||
// @dart=2.12
|
||||
|
||||
// ignore_for_file: unused_element, unused_import
|
||||
// ignore_for_file: always_put_required_named_parameters_first
|
||||
// ignore_for_file: constant_identifier_names
|
||||
// ignore_for_file: lines_longer_than_80_chars
|
||||
|
||||
import 'package:openapi/api.dart';
|
||||
import 'package:test/test.dart';
|
||||
|
||||
|
||||
/// tests for LibraryApi
|
||||
void main() {
|
||||
// final instance = LibraryApi();
|
||||
|
||||
group('tests for LibraryApi', () {
|
||||
//Future<LibraryResponseDto> createLibrary(CreateLibraryDto createLibraryDto) async
|
||||
test('test createLibrary', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
//Future deleteLibrary(String id) async
|
||||
test('test deleteLibrary', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
//Future<List<LibraryResponseDto>> getAllForUser() async
|
||||
test('test getAllForUser', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
//Future<LibraryResponseDto> getLibraryInfo(String id) async
|
||||
test('test getLibraryInfo', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
//Future<LibraryStatsResponseDto> getLibraryStatistics(String id) async
|
||||
test('test getLibraryStatistics', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
//Future removeOfflineFiles(String id) async
|
||||
test('test removeOfflineFiles', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
//Future scanLibrary(String id, ScanLibraryDto scanLibraryDto) async
|
||||
test('test scanLibrary', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
//Future<LibraryResponseDto> updateLibrary(String id, UpdateLibraryDto updateLibraryDto) async
|
||||
test('test updateLibrary', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
});
|
||||
}
|
||||
@ -0,0 +1,72 @@
|
||||
//
|
||||
// AUTO-GENERATED FILE, DO NOT MODIFY!
|
||||
//
|
||||
// @dart=2.12
|
||||
|
||||
// ignore_for_file: unused_element, unused_import
|
||||
// ignore_for_file: always_put_required_named_parameters_first
|
||||
// ignore_for_file: constant_identifier_names
|
||||
// ignore_for_file: lines_longer_than_80_chars
|
||||
|
||||
import 'package:openapi/api.dart';
|
||||
import 'package:test/test.dart';
|
||||
|
||||
// tests for LibraryResponseDto
|
||||
void main() {
|
||||
// final instance = LibraryResponseDto();
|
||||
|
||||
group('test LibraryResponseDto', () {
|
||||
// int assetCount
|
||||
test('to test the property `assetCount`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// DateTime createdAt
|
||||
test('to test the property `createdAt`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// List<String> exclusionPatterns (default value: const [])
|
||||
test('to test the property `exclusionPatterns`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// String id
|
||||
test('to test the property `id`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// List<String> importPaths (default value: const [])
|
||||
test('to test the property `importPaths`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// String name
|
||||
test('to test the property `name`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// String ownerId
|
||||
test('to test the property `ownerId`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// DateTime refreshedAt
|
||||
test('to test the property `refreshedAt`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// LibraryType type
|
||||
test('to test the property `type`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// DateTime updatedAt
|
||||
test('to test the property `updatedAt`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
|
||||
});
|
||||
|
||||
}
|
||||
@ -0,0 +1,42 @@
|
||||
//
|
||||
// AUTO-GENERATED FILE, DO NOT MODIFY!
|
||||
//
|
||||
// @dart=2.12
|
||||
|
||||
// ignore_for_file: unused_element, unused_import
|
||||
// ignore_for_file: always_put_required_named_parameters_first
|
||||
// ignore_for_file: constant_identifier_names
|
||||
// ignore_for_file: lines_longer_than_80_chars
|
||||
|
||||
import 'package:openapi/api.dart';
|
||||
import 'package:test/test.dart';
|
||||
|
||||
// tests for LibraryStatsResponseDto
|
||||
void main() {
|
||||
// final instance = LibraryStatsResponseDto();
|
||||
|
||||
group('test LibraryStatsResponseDto', () {
|
||||
// int photos (default value: 0)
|
||||
test('to test the property `photos`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// int total (default value: 0)
|
||||
test('to test the property `total`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// int usage (default value: 0)
|
||||
test('to test the property `usage`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// int videos (default value: 0)
|
||||
test('to test the property `videos`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
|
||||
});
|
||||
|
||||
}
|
||||
@ -0,0 +1,21 @@
|
||||
//
|
||||
// AUTO-GENERATED FILE, DO NOT MODIFY!
|
||||
//
|
||||
// @dart=2.12
|
||||
|
||||
// ignore_for_file: unused_element, unused_import
|
||||
// ignore_for_file: always_put_required_named_parameters_first
|
||||
// ignore_for_file: constant_identifier_names
|
||||
// ignore_for_file: lines_longer_than_80_chars
|
||||
|
||||
import 'package:openapi/api.dart';
|
||||
import 'package:test/test.dart';
|
||||
|
||||
// tests for LibraryType
|
||||
void main() {
|
||||
|
||||
group('test LibraryType', () {
|
||||
|
||||
});
|
||||
|
||||
}
|
||||
@ -0,0 +1,32 @@
|
||||
//
|
||||
// AUTO-GENERATED FILE, DO NOT MODIFY!
|
||||
//
|
||||
// @dart=2.12
|
||||
|
||||
// ignore_for_file: unused_element, unused_import
|
||||
// ignore_for_file: always_put_required_named_parameters_first
|
||||
// ignore_for_file: constant_identifier_names
|
||||
// ignore_for_file: lines_longer_than_80_chars
|
||||
|
||||
import 'package:openapi/api.dart';
|
||||
import 'package:test/test.dart';
|
||||
|
||||
// tests for ScanLibraryDto
|
||||
void main() {
|
||||
// final instance = ScanLibraryDto();
|
||||
|
||||
group('test ScanLibraryDto', () {
|
||||
// bool refreshAllFiles (default value: false)
|
||||
test('to test the property `refreshAllFiles`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// bool refreshModifiedFiles
|
||||
test('to test the property `refreshModifiedFiles`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
|
||||
});
|
||||
|
||||
}
|
||||
@ -0,0 +1,42 @@
|
||||
//
|
||||
// AUTO-GENERATED FILE, DO NOT MODIFY!
|
||||
//
|
||||
// @dart=2.12
|
||||
|
||||
// ignore_for_file: unused_element, unused_import
|
||||
// ignore_for_file: always_put_required_named_parameters_first
|
||||
// ignore_for_file: constant_identifier_names
|
||||
// ignore_for_file: lines_longer_than_80_chars
|
||||
|
||||
import 'package:openapi/api.dart';
|
||||
import 'package:test/test.dart';
|
||||
|
||||
// tests for UpdateLibraryDto
|
||||
void main() {
|
||||
// final instance = UpdateLibraryDto();
|
||||
|
||||
group('test UpdateLibraryDto', () {
|
||||
// List<String> exclusionPatterns (default value: const [])
|
||||
test('to test the property `exclusionPatterns`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// List<String> importPaths (default value: const [])
|
||||
test('to test the property `importPaths`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// bool isVisible
|
||||
test('to test the property `isVisible`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
// String name
|
||||
test('to test the property `name`', () async {
|
||||
// TODO
|
||||
});
|
||||
|
||||
|
||||
});
|
||||
|
||||
}
|
||||
File diff suppressed because it is too large
@ -0,0 +1,3 @@
|
||||
export * from './library.dto';
|
||||
export * from './library.repository';
|
||||
export * from './library.service';
|
||||
@ -0,0 +1,124 @@
|
||||
import { LibraryEntity, LibraryType } from '@app/infra/entities';
|
||||
import { ApiProperty } from '@nestjs/swagger';
|
||||
import { IsBoolean, IsEnum, IsNotEmpty, IsOptional, IsString } from 'class-validator';
|
||||
import { ValidateUUID } from '../domain.util';
|
||||
|
||||
export class CreateLibraryDto {
|
||||
@IsEnum(LibraryType)
|
||||
@ApiProperty({ enumName: 'LibraryType', enum: LibraryType })
|
||||
type!: LibraryType;
|
||||
|
||||
@IsString()
|
||||
@IsOptional()
|
||||
@IsNotEmpty()
|
||||
name?: string;
|
||||
|
||||
@IsOptional()
|
||||
@IsBoolean()
|
||||
isVisible?: boolean;
|
||||
|
||||
@IsOptional()
|
||||
@IsString({ each: true })
|
||||
@IsNotEmpty({ each: true })
|
||||
importPaths?: string[];
|
||||
|
||||
@IsOptional()
|
||||
@IsString({ each: true })
|
||||
@IsNotEmpty({ each: true })
|
||||
exclusionPatterns?: string[];
|
||||
}
|
||||
|
||||
export class UpdateLibraryDto {
|
||||
@IsOptional()
|
||||
@IsString()
|
||||
@IsNotEmpty()
|
||||
name?: string;
|
||||
|
||||
@IsOptional()
|
||||
@IsBoolean()
|
||||
isVisible?: boolean;
|
||||
|
||||
@IsOptional()
|
||||
@IsString({ each: true })
|
||||
@IsNotEmpty({ each: true })
|
||||
importPaths?: string[];
|
||||
|
||||
@IsOptional()
|
||||
@IsNotEmpty({ each: true })
|
||||
@IsString({ each: true })
|
||||
exclusionPatterns?: string[];
|
||||
}
|
||||
|
||||
export class CrawlOptionsDto {
|
||||
pathsToCrawl!: string[];
|
||||
includeHidden? = false;
|
||||
exclusionPatterns?: string[];
|
||||
}
|
||||
|
||||
export class LibrarySearchDto {
|
||||
@ValidateUUID({ optional: true })
|
||||
userId?: string;
|
||||
}
|
||||
|
||||
export class ScanLibraryDto {
|
||||
@IsBoolean()
|
||||
@IsOptional()
|
||||
refreshModifiedFiles?: boolean;
|
||||
|
||||
@IsBoolean()
|
||||
@IsOptional()
|
||||
refreshAllFiles?: boolean = false;
|
||||
}
|
||||
|
||||
export class LibraryResponseDto {
|
||||
id!: string;
|
||||
ownerId!: string;
|
||||
name!: string;
|
||||
|
||||
@ApiProperty({ enumName: 'LibraryType', enum: LibraryType })
|
||||
type!: LibraryType;
|
||||
|
||||
@ApiProperty({ type: 'integer' })
|
||||
assetCount!: number;
|
||||
|
||||
importPaths!: string[];
|
||||
|
||||
exclusionPatterns!: string[];
|
||||
|
||||
createdAt!: Date;
|
||||
updatedAt!: Date;
|
||||
refreshedAt!: Date | null;
|
||||
}
|
||||
|
||||
export class LibraryStatsResponseDto {
|
||||
@ApiProperty({ type: 'integer' })
|
||||
photos = 0;
|
||||
|
||||
@ApiProperty({ type: 'integer' })
|
||||
videos = 0;
|
||||
|
||||
@ApiProperty({ type: 'integer' })
|
||||
total = 0;
|
||||
|
||||
@ApiProperty({ type: 'integer', format: 'int64' })
|
||||
usage = 0;
|
||||
}
|
||||
|
||||
export function mapLibrary(entity: LibraryEntity): LibraryResponseDto {
|
||||
let assetCount = 0;
|
||||
if (entity.assets) {
|
||||
assetCount = entity.assets.length;
|
||||
}
|
||||
return {
|
||||
id: entity.id,
|
||||
ownerId: entity.ownerId,
|
||||
type: entity.type,
|
||||
name: entity.name,
|
||||
createdAt: entity.createdAt,
|
||||
updatedAt: entity.updatedAt,
|
||||
refreshedAt: entity.refreshedAt,
|
||||
assetCount,
|
||||
importPaths: entity.importPaths,
|
||||
exclusionPatterns: entity.exclusionPatterns,
|
||||
};
|
||||
}
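
For orientation, here is a minimal sketch (not part of the commit) of how mapLibrary flattens an entity into a LibraryResponseDto; the entity literal is hypothetical test data, and the cast is only there because a real LibraryEntity would come from TypeORM.

import { LibraryEntity, LibraryType } from '@app/infra/entities';
import { mapLibrary } from './library.dto';

// Hypothetical entity-shaped object; real entities are loaded by TypeORM.
const entity = {
  id: '00000000-0000-0000-0000-000000000000',
  ownerId: '00000000-0000-0000-0000-000000000001',
  name: 'Photos on NAS',
  type: LibraryType.EXTERNAL,
  importPaths: ['/mnt/photos'],
  exclusionPatterns: ['**/Thumbs.db'],
  createdAt: new Date(),
  updatedAt: new Date(),
  refreshedAt: null,
  assets: [], // assetCount is derived from assets.length when the relation is loaded
} as unknown as LibraryEntity;

const dto = mapLibrary(entity);
console.log(dto.assetCount); // 0 for the empty assets array above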
|
||||
@ -0,0 +1,22 @@
|
||||
import { LibraryEntity, LibraryType } from '@app/infra/entities';
|
||||
import { LibraryStatsResponseDto } from './library.dto';
|
||||
|
||||
export const ILibraryRepository = 'ILibraryRepository';
|
||||
|
||||
export interface ILibraryRepository {
|
||||
getCountForUser(ownerId: string): Promise<number>;
|
||||
getAllByUserId(userId: string, type?: LibraryType): Promise<LibraryEntity[]>;
|
||||
getAll(withDeleted?: boolean, type?: LibraryType): Promise<LibraryEntity[]>;
|
||||
getAllDeleted(): Promise<LibraryEntity[]>;
|
||||
get(id: string, withDeleted?: boolean): Promise<LibraryEntity | null>;
|
||||
create(library: Partial<LibraryEntity>): Promise<LibraryEntity>;
|
||||
delete(id: string): Promise<void>;
|
||||
softDelete(id: string): Promise<void>;
|
||||
getDefaultUploadLibrary(ownerId: string): Promise<LibraryEntity | null>;
|
||||
getUploadLibraryCount(ownerId: string): Promise<number>;
|
||||
update(library: Partial<LibraryEntity>): Promise<LibraryEntity>;
|
||||
getStatistics(id: string): Promise<LibraryStatsResponseDto>;
|
||||
getOnlineAssetPaths(id: string): Promise<string[]>;
|
||||
getAssetIds(id: string, withDeleted?: boolean): Promise<string[]>;
|
||||
existsByName(name: string, withDeleted?: boolean): Promise<boolean>;
|
||||
}
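
A hedged sketch of stubbing this interface in a unit test; the stub below is hypothetical, fills in only the methods it names, and assumes Jest is the test runner.

import { LibraryEntity, LibraryType } from '@app/infra/entities';
import { ILibraryRepository } from './library.repository';

// Hypothetical partial stub; the cast is deliberate because methods a given test
// does not exercise are intentionally left out.
const libraryRepositoryMock = {
  get: jest.fn().mockResolvedValue(null),
  getCountForUser: jest.fn().mockResolvedValue(1),
  getUploadLibraryCount: jest.fn().mockResolvedValue(1),
  create: jest.fn((library: Partial<LibraryEntity>) =>
    Promise.resolve({ id: 'stub-library-id', type: LibraryType.UPLOAD, ...library } as LibraryEntity),
  ),
} as unknown as ILibraryRepository;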
|
||||
File diff suppressed because it is too large
@ -0,0 +1,468 @@
|
||||
import { AssetType, LibraryType } from '@app/infra/entities';
|
||||
import { BadRequestException, Inject, Injectable, Logger } from '@nestjs/common';
|
||||
import { R_OK } from 'node:constants';
|
||||
import { Stats } from 'node:fs';
|
||||
import path from 'node:path';
|
||||
import { basename, parse } from 'path';
|
||||
import { AccessCore, IAccessRepository, Permission } from '../access';
|
||||
import { IAssetRepository, WithProperty } from '../asset';
|
||||
import { AuthUserDto } from '../auth';
|
||||
import { usePagination } from '../domain.util';
|
||||
|
||||
import { ICryptoRepository } from '../crypto';
|
||||
import { mimeTypes } from '../domain.constant';
|
||||
import {
|
||||
IBaseJob,
|
||||
IEntityJob,
|
||||
IJobRepository,
|
||||
ILibraryFileJob,
|
||||
ILibraryRefreshJob,
|
||||
IOfflineLibraryFileJob,
|
||||
JobName,
|
||||
JOBS_ASSET_PAGINATION_SIZE,
|
||||
} from '../job';
|
||||
import { IStorageRepository } from '../storage';
|
||||
import { IUserRepository } from '../user';
|
||||
import {
|
||||
CreateLibraryDto,
|
||||
LibraryResponseDto,
|
||||
LibraryStatsResponseDto,
|
||||
mapLibrary,
|
||||
ScanLibraryDto,
|
||||
UpdateLibraryDto,
|
||||
} from './library.dto';
|
||||
import { ILibraryRepository } from './library.repository';
|
||||
|
||||
@Injectable()
|
||||
export class LibraryService {
|
||||
readonly logger = new Logger(LibraryService.name);
|
||||
private access: AccessCore;
|
||||
|
||||
constructor(
|
||||
@Inject(IAccessRepository) accessRepository: IAccessRepository,
|
||||
@Inject(IAssetRepository) private assetRepository: IAssetRepository,
|
||||
@Inject(ICryptoRepository) private cryptoRepository: ICryptoRepository,
|
||||
@Inject(IJobRepository) private jobRepository: IJobRepository,
|
||||
@Inject(ILibraryRepository) private repository: ILibraryRepository,
|
||||
@Inject(IStorageRepository) private storageRepository: IStorageRepository,
|
||||
@Inject(IUserRepository) private userRepository: IUserRepository,
|
||||
) {
|
||||
this.access = new AccessCore(accessRepository);
|
||||
}
|
||||
|
||||
async getStatistics(authUser: AuthUserDto, id: string): Promise<LibraryStatsResponseDto> {
|
||||
await this.access.requirePermission(authUser, Permission.LIBRARY_READ, id);
|
||||
return this.repository.getStatistics(id);
|
||||
}
|
||||
|
||||
async getCount(authUser: AuthUserDto): Promise<number> {
|
||||
return this.repository.getCountForUser(authUser.id);
|
||||
}
|
||||
|
||||
async getAllForUser(authUser: AuthUserDto): Promise<LibraryResponseDto[]> {
|
||||
const libraries = await this.repository.getAllByUserId(authUser.id);
|
||||
return libraries.map((library) => mapLibrary(library));
|
||||
}
|
||||
|
||||
async get(authUser: AuthUserDto, id: string): Promise<LibraryResponseDto> {
|
||||
await this.access.requirePermission(authUser, Permission.LIBRARY_READ, id);
|
||||
const library = await this.findOrFail(id);
|
||||
return mapLibrary(library);
|
||||
}
|
||||
|
||||
async handleQueueCleanup(): Promise<boolean> {
|
||||
this.logger.debug('Cleaning up any pending library deletions');
|
||||
const pendingDeletion = await this.repository.getAllDeleted();
|
||||
for (const libraryToDelete of pendingDeletion) {
|
||||
await this.jobRepository.queue({ name: JobName.LIBRARY_DELETE, data: { id: libraryToDelete.id } });
|
||||
}
|
||||
return true;
|
||||
}
|
||||
|
||||
async create(authUser: AuthUserDto, dto: CreateLibraryDto): Promise<LibraryResponseDto> {
|
||||
switch (dto.type) {
|
||||
case LibraryType.EXTERNAL:
|
||||
if (!dto.name) {
|
||||
dto.name = 'New External Library';
|
||||
}
|
||||
break;
|
||||
case LibraryType.UPLOAD:
|
||||
if (!dto.name) {
|
||||
dto.name = 'New Upload Library';
|
||||
}
|
||||
if (dto.importPaths && dto.importPaths.length > 0) {
|
||||
throw new BadRequestException('Upload libraries cannot have import paths');
|
||||
}
|
||||
if (dto.exclusionPatterns && dto.exclusionPatterns.length > 0) {
|
||||
throw new BadRequestException('Upload libraries cannot have exclusion patterns');
|
||||
}
|
||||
break;
|
||||
}
|
||||
|
||||
const library = await this.repository.create({
|
||||
ownerId: authUser.id,
|
||||
name: dto.name,
|
||||
type: dto.type,
|
||||
importPaths: dto.importPaths ?? [],
|
||||
exclusionPatterns: dto.exclusionPatterns ?? [],
|
||||
isVisible: dto.isVisible ?? true,
|
||||
});
|
||||
|
||||
return mapLibrary(library);
|
||||
}
|
||||
|
||||
async update(authUser: AuthUserDto, id: string, dto: UpdateLibraryDto): Promise<LibraryResponseDto> {
|
||||
await this.access.requirePermission(authUser, Permission.LIBRARY_UPDATE, id);
|
||||
const library = await this.repository.update({ id, ...dto });
|
||||
return mapLibrary(library);
|
||||
}
|
||||
|
||||
async delete(authUser: AuthUserDto, id: string) {
|
||||
await this.access.requirePermission(authUser, Permission.LIBRARY_DELETE, id);
|
||||
|
||||
const library = await this.findOrFail(id);
|
||||
const uploadCount = await this.repository.getUploadLibraryCount(authUser.id);
|
||||
if (library.type === LibraryType.UPLOAD && uploadCount <= 1) {
|
||||
throw new BadRequestException('Cannot delete the last upload library');
|
||||
}
|
||||
|
||||
await this.repository.softDelete(id);
|
||||
await this.jobRepository.queue({ name: JobName.LIBRARY_DELETE, data: { id } });
|
||||
}
|
||||
|
||||
async handleDeleteLibrary(job: IEntityJob): Promise<boolean> {
|
||||
const library = await this.repository.get(job.id, true);
|
||||
if (!library) {
|
||||
return false;
|
||||
}
|
||||
|
||||
// TODO use pagination
|
||||
const assetIds = await this.repository.getAssetIds(job.id);
|
||||
this.logger.debug(`Will delete ${assetIds.length} asset(s) in library ${job.id}`);
|
||||
// TODO queue a job for asset deletion
|
||||
await this.deleteAssets(assetIds);
|
||||
this.logger.log(`Deleting library ${job.id}`);
|
||||
await this.repository.delete(job.id);
|
||||
return true;
|
||||
}
|
||||
|
||||
async handleAssetRefresh(job: ILibraryFileJob) {
|
||||
const assetPath = path.normalize(job.assetPath);
|
||||
|
||||
const user = await this.userRepository.get(job.ownerId);
|
||||
if (!user?.externalPath) {
|
||||
this.logger.warn('User has no external path set, cannot import asset');
|
||||
return false;
|
||||
}
|
||||
|
||||
if (!path.normalize(assetPath).match(new RegExp(`^${user.externalPath}`))) {
|
||||
this.logger.error("Asset must be within the user's external path");
|
||||
return false;
|
||||
}
|
||||
|
||||
const existingAssetEntity = await this.assetRepository.getByLibraryIdAndOriginalPath(job.id, assetPath);
|
||||
|
||||
let stats: Stats;
|
||||
try {
|
||||
stats = await this.storageRepository.stat(assetPath);
|
||||
} catch (error: any) {
|
||||
// Can't access file, probably offline
|
||||
if (existingAssetEntity) {
|
||||
// Mark asset as offline
|
||||
this.logger.debug(`Marking asset as offline: ${assetPath}`);
|
||||
|
||||
await this.assetRepository.save({ id: existingAssetEntity.id, isOffline: true });
|
||||
return true;
|
||||
} else {
|
||||
// File can't be accessed and does not already exist in db
|
||||
throw new BadRequestException("Can't access file", { cause: error });
|
||||
}
|
||||
}
|
||||
|
||||
let doImport = false;
|
||||
let doRefresh = false;
|
||||
|
||||
if (job.forceRefresh) {
|
||||
doRefresh = true;
|
||||
}
|
||||
|
||||
if (!existingAssetEntity) {
|
||||
// This asset is new to us, read it from disk
|
||||
this.logger.debug(`Importing new asset: ${assetPath}`);
|
||||
doImport = true;
|
||||
} else if (stats.mtime.toISOString() !== existingAssetEntity.fileModifiedAt.toISOString()) {
|
||||
// File modification time has changed since last time we checked, re-read from disk
|
||||
this.logger.debug(
|
||||
`File modification time has changed, re-importing asset: ${assetPath}. Old mtime: ${existingAssetEntity.fileModifiedAt}. New mtime: ${stats.mtime}`,
|
||||
);
|
||||
doRefresh = true;
|
||||
} else if (!job.forceRefresh && stats && !existingAssetEntity.isOffline) {
|
||||
// Asset exists on disk and in the database, and the mtime has not changed. We are also not forcing a refresh, so do nothing
|
||||
this.logger.debug(`Asset already exists in database and on disk, will not import: ${assetPath}`);
|
||||
}
|
||||
|
||||
if (stats && existingAssetEntity?.isOffline) {
|
||||
// File was previously offline but is now online
|
||||
this.logger.debug(`Marking previously-offline asset as online: ${assetPath}`);
|
||||
await this.assetRepository.save({ id: existingAssetEntity.id, isOffline: false });
|
||||
doRefresh = true;
|
||||
}
|
||||
|
||||
if (!doImport && !doRefresh) {
|
||||
// Neither importing nor refreshing, exit here
|
||||
return true;
|
||||
}
|
||||
|
||||
let assetType: AssetType;
|
||||
|
||||
if (mimeTypes.isImage(assetPath)) {
|
||||
assetType = AssetType.IMAGE;
|
||||
} else if (mimeTypes.isVideo(assetPath)) {
|
||||
assetType = AssetType.VIDEO;
|
||||
} else {
|
||||
throw new BadRequestException(`Unsupported file type ${assetPath}`);
|
||||
}
|
||||
|
||||
// TODO: does the .xmp sidecar replace the file extension? Needs investigation
|
||||
let sidecarPath: string | null = null;
|
||||
if (await this.storageRepository.checkFileExists(`${assetPath}.xmp`, R_OK)) {
|
||||
sidecarPath = `${assetPath}.xmp`;
|
||||
}
|
||||
|
||||
const deviceAssetId = `${basename(assetPath)}`.replace(/\s+/g, '');
|
||||
|
||||
const pathHash = this.cryptoRepository.hashSha1(`path:${assetPath}`);
|
||||
|
||||
let assetId;
|
||||
if (doImport) {
|
||||
const library = await this.repository.get(job.id, true);
|
||||
if (library?.deletedAt) {
|
||||
this.logger.error('Cannot import asset into deleted library');
|
||||
return false;
|
||||
}
|
||||
|
||||
// TODO: pending a refactor of the domain asset service, this is written out manually here
|
||||
const addedAsset = await this.assetRepository.create({
|
||||
ownerId: job.ownerId,
|
||||
libraryId: job.id,
|
||||
checksum: pathHash,
|
||||
originalPath: assetPath,
|
||||
deviceAssetId: deviceAssetId,
|
||||
deviceId: 'Library Import',
|
||||
fileCreatedAt: stats.ctime,
|
||||
fileModifiedAt: stats.mtime,
|
||||
type: assetType,
|
||||
originalFileName: parse(assetPath).name,
|
||||
sidecarPath,
|
||||
isReadOnly: true,
|
||||
isExternal: true,
|
||||
});
|
||||
assetId = addedAsset.id;
|
||||
} else if (doRefresh && existingAssetEntity) {
|
||||
assetId = existingAssetEntity.id;
|
||||
await this.assetRepository.updateAll([existingAssetEntity.id], {
|
||||
fileCreatedAt: stats.ctime,
|
||||
fileModifiedAt: stats.mtime,
|
||||
});
|
||||
} else {
|
||||
// Not importing and not refreshing, do nothing
|
||||
return true;
|
||||
}
|
||||
|
||||
this.logger.debug(`Queuing metadata extraction for: ${assetPath}`);
|
||||
|
||||
await this.jobRepository.queue({ name: JobName.METADATA_EXTRACTION, data: { id: assetId, source: 'upload' } });
|
||||
|
||||
if (assetType === AssetType.VIDEO) {
|
||||
await this.jobRepository.queue({ name: JobName.VIDEO_CONVERSION, data: { id: assetId } });
|
||||
}
|
||||
|
||||
return true;
|
||||
}
|
||||
|
||||
async queueScan(authUser: AuthUserDto, id: string, dto: ScanLibraryDto) {
|
||||
await this.access.requirePermission(authUser, Permission.LIBRARY_UPDATE, id);
|
||||
|
||||
const library = await this.repository.get(id);
|
||||
if (!library || library.type !== LibraryType.EXTERNAL) {
|
||||
throw new BadRequestException('Can only refresh external libraries');
|
||||
}
|
||||
|
||||
await this.jobRepository.queue({
|
||||
name: JobName.LIBRARY_SCAN,
|
||||
data: {
|
||||
id,
|
||||
refreshModifiedFiles: dto.refreshModifiedFiles ?? false,
|
||||
refreshAllFiles: dto.refreshAllFiles ?? false,
|
||||
},
|
||||
});
|
||||
}
|
||||
|
||||
async queueRemoveOffline(authUser: AuthUserDto, id: string) {
|
||||
this.logger.verbose(`Removing offline files from library: ${id}`);
|
||||
await this.access.requirePermission(authUser, Permission.LIBRARY_UPDATE, id);
|
||||
|
||||
await this.jobRepository.queue({
|
||||
name: JobName.LIBRARY_REMOVE_OFFLINE,
|
||||
data: {
|
||||
id,
|
||||
},
|
||||
});
|
||||
}
|
||||
|
||||
async handleQueueAllScan(job: IBaseJob): Promise<boolean> {
|
||||
this.logger.debug(`Refreshing all external libraries: force=${job.force}`);
|
||||
|
||||
// Queue cleanup
|
||||
await this.jobRepository.queue({ name: JobName.LIBRARY_QUEUE_CLEANUP, data: {} });
|
||||
|
||||
// Queue all library refresh
|
||||
const libraries = await this.repository.getAll(true, LibraryType.EXTERNAL);
|
||||
for (const library of libraries) {
|
||||
await this.jobRepository.queue({
|
||||
name: JobName.LIBRARY_SCAN,
|
||||
data: {
|
||||
id: library.id,
|
||||
refreshModifiedFiles: !job.force,
|
||||
refreshAllFiles: job.force ?? false,
|
||||
},
|
||||
});
|
||||
}
|
||||
return true;
|
||||
}
|
||||
|
||||
async handleOfflineRemoval(job: IEntityJob): Promise<boolean> {
|
||||
const assetPagination = usePagination(JOBS_ASSET_PAGINATION_SIZE, (pagination) => {
|
||||
return this.assetRepository.getWith(pagination, WithProperty.IS_OFFLINE, job.id);
|
||||
});
|
||||
|
||||
const assetIds: string[] = [];
|
||||
|
||||
for await (const assets of assetPagination) {
|
||||
for (const asset of assets) {
|
||||
assetIds.push(asset.id);
|
||||
}
|
||||
}
|
||||
|
||||
this.logger.verbose(`Found ${assetIds.length} offline assets to remove`);
|
||||
await this.deleteAssets(assetIds);
|
||||
return true;
|
||||
}
|
||||
|
||||
async handleQueueAssetRefresh(job: ILibraryRefreshJob): Promise<boolean> {
|
||||
const library = await this.repository.get(job.id);
|
||||
if (!library || library.type !== LibraryType.EXTERNAL) {
|
||||
this.logger.warn('Can only refresh external libraries');
|
||||
return false;
|
||||
}
|
||||
|
||||
const user = await this.userRepository.get(library.ownerId);
|
||||
if (!user?.externalPath) {
|
||||
this.logger.warn('User has no external path set, cannot refresh library');
|
||||
return false;
|
||||
}
|
||||
|
||||
this.logger.verbose(`Refreshing library: ${job.id}`);
|
||||
const crawledAssetPaths = (
|
||||
await this.storageRepository.crawl({
|
||||
pathsToCrawl: library.importPaths,
|
||||
exclusionPatterns: library.exclusionPatterns,
|
||||
})
|
||||
)
|
||||
.map(path.normalize)
|
||||
.filter((assetPath) =>
|
||||
// Filter out paths that are not within the user's external path
|
||||
assetPath.match(new RegExp(`^${user.externalPath}`)),
|
||||
);
|
||||
|
||||
this.logger.debug(`Found ${crawledAssetPaths.length} assets when crawling import paths ${library.importPaths}`);
|
||||
const assetsInLibrary = await this.assetRepository.getByLibraryId([job.id]);
|
||||
const offlineAssets = assetsInLibrary.filter((asset) => !crawledAssetPaths.includes(asset.originalPath));
|
||||
this.logger.debug(`${offlineAssets.length} assets in library are not present on disk and will be marked offline`);
|
||||
|
||||
for (const offlineAsset of offlineAssets) {
|
||||
const offlineJobData: IOfflineLibraryFileJob = {
|
||||
id: job.id,
|
||||
assetPath: offlineAsset.originalPath,
|
||||
};
|
||||
|
||||
await this.jobRepository.queue({ name: JobName.LIBRARY_MARK_ASSET_OFFLINE, data: offlineJobData });
|
||||
}
|
||||
|
||||
if (crawledAssetPaths.length > 0) {
|
||||
let filteredPaths: string[] = [];
|
||||
if (job.refreshAllFiles || job.refreshModifiedFiles) {
|
||||
filteredPaths = crawledAssetPaths;
|
||||
} else {
|
||||
const existingPaths = await this.repository.getOnlineAssetPaths(job.id);
|
||||
this.logger.debug(`Found ${existingPaths.length} existing asset(s) in library ${job.id}`);
|
||||
|
||||
filteredPaths = crawledAssetPaths.filter((assetPath) => !existingPaths.includes(assetPath));
|
||||
this.logger.debug(`After db comparison, ${filteredPaths.length} asset(s) remain to be imported`);
|
||||
}
|
||||
|
||||
for (const assetPath of filteredPaths) {
|
||||
const libraryJobData: ILibraryFileJob = {
|
||||
id: job.id,
|
||||
assetPath: path.normalize(assetPath),
|
||||
ownerId: library.ownerId,
|
||||
forceRefresh: job.refreshAllFiles ?? false,
|
||||
};
|
||||
|
||||
await this.jobRepository.queue({ name: JobName.LIBRARY_SCAN_ASSET, data: libraryJobData });
|
||||
}
|
||||
}
|
||||
|
||||
await this.repository.update({ id: job.id, refreshedAt: new Date() });
|
||||
|
||||
return true;
|
||||
}
|
||||
|
||||
async handleOfflineAsset(job: IOfflineLibraryFileJob): Promise<boolean> {
|
||||
const existingAssetEntity = await this.assetRepository.getByLibraryIdAndOriginalPath(job.id, job.assetPath);
|
||||
|
||||
if (existingAssetEntity) {
|
||||
this.logger.verbose(`Marking asset as offline: ${job.assetPath}`);
|
||||
await this.assetRepository.save({ id: existingAssetEntity.id, isOffline: true });
|
||||
}
|
||||
|
||||
return true;
|
||||
}
|
||||
|
||||
private async findOrFail(id: string) {
|
||||
const library = await this.repository.get(id);
|
||||
if (!library) {
|
||||
throw new BadRequestException('Library not found');
|
||||
}
|
||||
return library;
|
||||
}
|
||||
|
||||
private async deleteAssets(assetIds: string[]) {
|
||||
// TODO: this should be refactored to a centralized asset deletion service
|
||||
for (const assetId of assetIds) {
|
||||
const asset = await this.assetRepository.getById(assetId);
|
||||
this.logger.debug(`Removing asset from library: ${asset.originalPath}`);
|
||||
|
||||
if (asset.faces) {
|
||||
await Promise.all(
|
||||
asset.faces.map(({ assetId, personId }) =>
|
||||
this.jobRepository.queue({ name: JobName.SEARCH_REMOVE_FACE, data: { assetId, personId } }),
|
||||
),
|
||||
);
|
||||
}
|
||||
|
||||
await this.assetRepository.remove(asset);
|
||||
await this.jobRepository.queue({ name: JobName.SEARCH_REMOVE_ASSET, data: { ids: [asset.id] } });
|
||||
|
||||
await this.jobRepository.queue({
|
||||
name: JobName.DELETE_FILES,
|
||||
data: { files: [asset.webpPath, asset.resizePath, asset.encodedVideoPath, asset.sidecarPath] },
|
||||
});
|
||||
|
||||
// TODO refactor this to use cascades
|
||||
if (asset.livePhotoVideoId && !assetIds.includes(asset.livePhotoVideoId)) {
|
||||
assetIds.push(asset.livePhotoVideoId);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
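
The branching in handleAssetRefresh can be restated as a small pure function. This is a simplified paraphrase for readability only, not code from the commit, and it leaves out the offline bookkeeping and error handling.

// Simplified restatement of the import/refresh decision in handleAssetRefresh.
// `existing` stands in for the asset row found by library id + original path.
interface ExistingAsset {
  fileModifiedAt: Date;
  isOffline: boolean;
}

function decideAction(
  existing: ExistingAsset | null,
  mtime: Date,
  forceRefresh: boolean,
): 'import' | 'refresh' | 'skip' {
  if (!existing) {
    return 'import'; // new file on disk
  }
  if (forceRefresh) {
    return 'refresh'; // Force Scan All Library Files
  }
  if (mtime.toISOString() !== existing.fileModifiedAt.toISOString()) {
    return 'refresh'; // mtime changed since the last scan
  }
  if (existing.isOffline) {
    return 'refresh'; // file was offline and has come back online
  }
  return 'skip'; // unchanged, nothing to do
}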
|
||||
@ -0,0 +1,69 @@
|
||||
import {
|
||||
AuthUserDto,
|
||||
CreateLibraryDto as CreateDto,
|
||||
LibraryService,
|
||||
LibraryStatsResponseDto,
|
||||
LibraryResponseDto as ResponseDto,
|
||||
ScanLibraryDto,
|
||||
UpdateLibraryDto as UpdateDto,
|
||||
} from '@app/domain';
|
||||
import { Body, Controller, Delete, Get, Param, Post, Put } from '@nestjs/common';
|
||||
import { ApiTags } from '@nestjs/swagger';
|
||||
import { AuthUser, Authenticated } from '../app.guard';
|
||||
import { UseValidation } from '../app.utils';
|
||||
import { UUIDParamDto } from './dto/uuid-param.dto';
|
||||
|
||||
@ApiTags('Library')
|
||||
@Controller('library')
|
||||
@Authenticated()
|
||||
@UseValidation()
|
||||
export class LibraryController {
|
||||
constructor(private service: LibraryService) {}
|
||||
|
||||
@Get()
|
||||
getAllForUser(@AuthUser() authUser: AuthUserDto): Promise<ResponseDto[]> {
|
||||
return this.service.getAllForUser(authUser);
|
||||
}
|
||||
|
||||
@Post()
|
||||
createLibrary(@AuthUser() authUser: AuthUserDto, @Body() dto: CreateDto): Promise<ResponseDto> {
|
||||
return this.service.create(authUser, dto);
|
||||
}
|
||||
|
||||
@Put(':id')
|
||||
updateLibrary(
|
||||
@AuthUser() authUser: AuthUserDto,
|
||||
@Param() { id }: UUIDParamDto,
|
||||
@Body() dto: UpdateDto,
|
||||
): Promise<ResponseDto> {
|
||||
return this.service.update(authUser, id, dto);
|
||||
}
|
||||
|
||||
@Get(':id')
|
||||
getLibraryInfo(@AuthUser() authUser: AuthUserDto, @Param() { id }: UUIDParamDto): Promise<ResponseDto> {
|
||||
return this.service.get(authUser, id);
|
||||
}
|
||||
|
||||
@Delete(':id')
|
||||
deleteLibrary(@AuthUser() authUser: AuthUserDto, @Param() { id }: UUIDParamDto): Promise<void> {
|
||||
return this.service.delete(authUser, id);
|
||||
}
|
||||
|
||||
@Get(':id/statistics')
|
||||
getLibraryStatistics(
|
||||
@AuthUser() authUser: AuthUserDto,
|
||||
@Param() { id }: UUIDParamDto,
|
||||
): Promise<LibraryStatsResponseDto> {
|
||||
return this.service.getStatistics(authUser, id);
|
||||
}
|
||||
|
||||
@Post(':id/scan')
|
||||
scanLibrary(@AuthUser() authUser: AuthUserDto, @Param() { id }: UUIDParamDto, @Body() dto: ScanLibraryDto) {
|
||||
return this.service.queueScan(authUser, id, dto);
|
||||
}
|
||||
|
||||
@Post(':id/removeOffline')
|
||||
removeOfflineFiles(@AuthUser() authUser: AuthUserDto, @Param() { id }: UUIDParamDto) {
|
||||
return this.service.queueRemoveOffline(authUser, id);
|
||||
}
|
||||
}
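
A sketch of exercising these routes over HTTP. The route paths and request bodies follow the controller and DTOs above; the base URL, the x-api-key header, and the placeholder values are assumptions about a particular deployment.

// Assumes Node 18+ (global fetch) and API-key auth; adjust to your deployment.
const base = 'http://immich-server:3001/library'; // assumed server address
const headers = { 'x-api-key': '<api key>', 'Content-Type': 'application/json' };

async function libraryExample() {
  // POST /library — create an external library
  await fetch(base, {
    method: 'POST',
    headers,
    body: JSON.stringify({ type: 'EXTERNAL', name: 'NAS photos', importPaths: ['/mnt/photos'] }),
  });

  // POST /library/:id/scan — re-read files whose modification time changed
  await fetch(`${base}/<library id>/scan`, {
    method: 'POST',
    headers,
    body: JSON.stringify({ refreshModifiedFiles: true }),
  });
}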
|
||||
@ -0,0 +1,61 @@
|
||||
import {
|
||||
Column,
|
||||
CreateDateColumn,
|
||||
DeleteDateColumn,
|
||||
Entity,
|
||||
JoinTable,
|
||||
ManyToOne,
|
||||
OneToMany,
|
||||
PrimaryGeneratedColumn,
|
||||
UpdateDateColumn,
|
||||
} from 'typeorm';
|
||||
import { AssetEntity } from './asset.entity';
|
||||
import { UserEntity } from './user.entity';
|
||||
|
||||
@Entity('libraries')
|
||||
export class LibraryEntity {
|
||||
@PrimaryGeneratedColumn('uuid')
|
||||
id!: string;
|
||||
|
||||
@Column()
|
||||
name!: string;
|
||||
|
||||
@OneToMany(() => AssetEntity, (asset) => asset.library)
|
||||
@JoinTable()
|
||||
assets!: AssetEntity[];
|
||||
|
||||
@ManyToOne(() => UserEntity, { onDelete: 'CASCADE', onUpdate: 'CASCADE', nullable: false })
|
||||
owner!: UserEntity;
|
||||
|
||||
@Column()
|
||||
ownerId!: string;
|
||||
|
||||
@Column()
|
||||
type!: LibraryType;
|
||||
|
||||
@Column('text', { array: true })
|
||||
importPaths!: string[];
|
||||
|
||||
@Column('text', { array: true })
|
||||
exclusionPatterns!: string[];
|
||||
|
||||
@CreateDateColumn({ type: 'timestamptz' })
|
||||
createdAt!: Date;
|
||||
|
||||
@UpdateDateColumn({ type: 'timestamptz' })
|
||||
updatedAt!: Date;
|
||||
|
||||
@DeleteDateColumn({ type: 'timestamptz' })
|
||||
deletedAt?: Date;
|
||||
|
||||
@Column({ type: 'timestamptz', nullable: true })
|
||||
refreshedAt!: Date | null;
|
||||
|
||||
@Column({ type: 'boolean', default: true })
|
||||
isVisible!: boolean;
|
||||
}
|
||||
|
||||
export enum LibraryType {
|
||||
UPLOAD = 'UPLOAD',
|
||||
EXTERNAL = 'EXTERNAL',
|
||||
}
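
For reference, the Partial<LibraryEntity> that LibraryService.create hands to the repository for an external library looks roughly like this; the concrete values are illustrative only.

import { LibraryEntity, LibraryType } from '@app/infra/entities';

// Illustrative payload; ownerId comes from the authenticated user in the service.
const newExternalLibrary: Partial<LibraryEntity> = {
  ownerId: '00000000-0000-0000-0000-000000000001',
  name: 'New External Library', // default name applied when none is given
  type: LibraryType.EXTERNAL,
  importPaths: ['/mnt/photos'],
  exclusionPatterns: ['**/*.tmp'],
  isVisible: true, // defaults to true in the service
};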
|
||||
@ -0,0 +1,57 @@
import { MigrationInterface, QueryRunner } from 'typeorm';

export class AddLibraries1688392120838 implements MigrationInterface {
  name = 'AddLibraryTable1688392120838';

  public async up(queryRunner: QueryRunner): Promise<void> {
    // Checksum uniqueness is now scoped per (owner, library) instead of per owner.
    await queryRunner.query(`ALTER TABLE "assets" DROP CONSTRAINT "UQ_userid_checksum"`);
    await queryRunner.query(
      `CREATE TABLE "libraries" ("id" uuid NOT NULL DEFAULT uuid_generate_v4(), "name" character varying NOT NULL, "ownerId" uuid NOT NULL, "type" character varying NOT NULL, "importPaths" text array NOT NULL, "exclusionPatterns" text array NOT NULL, "createdAt" TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT now(), "updatedAt" TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT now(), "deletedAt" TIMESTAMP WITH TIME ZONE, "refreshedAt" TIMESTAMP WITH TIME ZONE, "isVisible" boolean NOT NULL DEFAULT true, CONSTRAINT "PK_505fedfcad00a09b3734b4223de" PRIMARY KEY ("id"))`,
    );
    await queryRunner.query(`ALTER TABLE "assets" ADD "isOffline" boolean NOT NULL DEFAULT false`);
    await queryRunner.query(`ALTER TABLE "assets" ADD "libraryId" uuid`);
    // originalPath no longer needs to be globally unique; duplicate detection is scoped to a library.
    await queryRunner.query(`ALTER TABLE "assets" DROP CONSTRAINT "UQ_4ed4f8052685ff5b1e7ca1058ba"`);
    await queryRunner.query(`ALTER TABLE "assets" ADD "isExternal" boolean NOT NULL DEFAULT false`);

    await queryRunner.query(
      `CREATE UNIQUE INDEX "UQ_assets_owner_library_checksum" on "assets" ("ownerId", "libraryId", checksum)`,
    );
    await queryRunner.query(
      `ALTER TABLE "libraries" ADD CONSTRAINT "FK_0f6fc2fb195f24d19b0fb0d57c1" FOREIGN KEY ("ownerId") REFERENCES "users"("id") ON DELETE CASCADE ON UPDATE CASCADE`,
    );
    await queryRunner.query(
      `ALTER TABLE "assets" ADD CONSTRAINT "FK_9977c3c1de01c3d848039a6b90c" FOREIGN KEY ("libraryId") REFERENCES "libraries"("id") ON DELETE CASCADE ON UPDATE CASCADE`,
    );

    // Create a default upload library for each user and assign all existing assets to it
    const userIds: string[] = (await queryRunner.query(`SELECT id FROM "users"`)).map((user: any) => user.id);

    for (const userId of userIds) {
      await queryRunner.query(
        `INSERT INTO "libraries" ("name", "ownerId", "type", "importPaths", "exclusionPatterns") VALUES ('Default Library', '${userId}', 'UPLOAD', '{}', '{}')`,
      );

      await queryRunner.query(
        `UPDATE "assets" SET "libraryId" = (SELECT id FROM "libraries" WHERE "ownerId" = '${userId}' LIMIT 1) WHERE "ownerId" = '${userId}'`,
      );
    }

    await queryRunner.query(`ALTER TABLE "assets" ALTER COLUMN "libraryId" SET NOT NULL`);
  }

  public async down(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query(`ALTER TABLE "assets" ALTER COLUMN "libraryId" DROP NOT NULL`);
    await queryRunner.query(`ALTER TABLE "assets" DROP CONSTRAINT "FK_9977c3c1de01c3d848039a6b90c"`);
    await queryRunner.query(`ALTER TABLE "libraries" DROP CONSTRAINT "FK_0f6fc2fb195f24d19b0fb0d57c1"`);
    await queryRunner.query(`DROP INDEX "UQ_assets_owner_library_checksum"`);
    await queryRunner.query(`ALTER TABLE "assets" DROP CONSTRAINT "UQ_owner_library_originalpath"`);
    await queryRunner.query(
      `ALTER TABLE "assets" ADD CONSTRAINT "UQ_4ed4f8052685ff5b1e7ca1058ba" UNIQUE ("originalPath")`,
    );
    await queryRunner.query(`ALTER TABLE "assets" DROP COLUMN "libraryId"`);
    await queryRunner.query(`ALTER TABLE "assets" DROP COLUMN "isOffline"`);
    await queryRunner.query(`ALTER TABLE "assets" DROP COLUMN "isExternal"`);
    await queryRunner.query(`DROP TABLE "libraries"`);
    await queryRunner.query(`ALTER TABLE "assets" ADD CONSTRAINT "UQ_userid_checksum" UNIQUE ("ownerId", "checksum")`);
  }
}
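In Immich this migration is applied by the server's own TypeORM setup, but as a rough, standalone sketch of how such a class gets executed, a DataSource can run it directly. The connection values and the import path below are placeholders, not the project's actual configuration.

// Hypothetical standalone runner; connection details and import path are assumptions.
import { DataSource } from 'typeorm';
import { AddLibraries1688392120838 } from './1688392120838-add-library-table';

const dataSource = new DataSource({
  type: 'postgres',
  host: 'localhost',
  port: 5432,
  username: 'postgres',
  password: 'postgres',
  database: 'immich',
  migrations: [AddLibraries1688392120838],
});

async function main() {
  await dataSource.initialize();
  await dataSource.runMigrations(); // applies up() for any pending migrations, including this one
  await dataSource.destroy();
}

main();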
@ -0,0 +1,209 @@
import { CrawlOptionsDto } from '@app/domain';
import mockfs from 'mock-fs';
import { FilesystemProvider } from './filesystem.provider';

describe(FilesystemProvider.name, () => {
  const sut: FilesystemProvider = new FilesystemProvider();

  describe('crawl', () => {
    it('should return empty when crawling an empty path list', async () => {
      const options = new CrawlOptionsDto();
      options.pathsToCrawl = [];
      const paths: string[] = await sut.crawl(options);
      expect(paths).toHaveLength(0);
    });

    it('should crawl a single path', async () => {
      mockfs({
        '/photos/image.jpg': '',
      });

      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos/'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(['/photos/image.jpg'].sort());
    });

    it('should exclude by file extension', async () => {
      mockfs({
        '/photos/image.jpg': '',
        '/photos/image.tif': '',
      });

      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos/'];
      options.exclusionPatterns = ['**/*.tif'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(['/photos/image.jpg'].sort());
    });

    it('should exclude by file extension without case sensitivity', async () => {
      mockfs({
        '/photos/image.jpg': '',
        '/photos/image.tif': '',
      });

      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos/'];
      options.exclusionPatterns = ['**/*.TIF'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(['/photos/image.jpg'].sort());
    });

    it('should exclude by folder', async () => {
      mockfs({
        '/photos/image.jpg': '',
        '/photos/raw/image.jpg': '',
        '/photos/raw2/image.jpg': '',
        '/photos/folder/raw/image.jpg': '',
        '/photos/crawl/image.jpg': '',
      });

      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos/'];
      options.exclusionPatterns = ['**/raw/**'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(['/photos/image.jpg', '/photos/raw2/image.jpg', '/photos/crawl/image.jpg'].sort());
    });

    it('should crawl multiple paths', async () => {
      mockfs({
        '/photos/image1.jpg': '',
        '/images/image2.jpg': '',
        '/albums/image3.jpg': '',
      });
      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos/', '/images/', '/albums/'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(['/photos/image1.jpg', '/images/image2.jpg', '/albums/image3.jpg'].sort());
    });

    it('should support globbing paths', async () => {
      mockfs({
        '/photos1/image1.jpg': '',
        '/photos2/image2.jpg': '',
        '/images/image3.jpg': '',
      });
      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos*'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(['/photos1/image1.jpg', '/photos2/image2.jpg'].sort());
    });

    it('should crawl a single path without trailing slash', async () => {
      mockfs({
        '/photos/image.jpg': '',
      });
      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(['/photos/image.jpg'].sort());
    });

    // TODO: test for hidden paths (not yet implemented)

    it('should recursively crawl a single path, ignoring files outside of it', async () => {
      mockfs({
        '/photos/image.jpg': '',
        '/photos/subfolder/image1.jpg': '',
        '/photos/subfolder/image2.jpg': '',
        '/image1.jpg': '',
      });
      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos/'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(
        ['/photos/image.jpg', '/photos/subfolder/image1.jpg', '/photos/subfolder/image2.jpg'].sort(),
      );
    });

    it('should filter file extensions', async () => {
      mockfs({
        '/photos/image.jpg': '',
        '/photos/image.txt': '',
        '/photos/1': '',
      });
      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos/'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(['/photos/image.jpg'].sort());
    });

    it('should include photo and video extensions', async () => {
      mockfs({
        '/photos/image.jpg': '',
        '/photos/image.jpeg': '',
        '/photos/image.heic': '',
        '/photos/image.heif': '',
        '/photos/image.png': '',
        '/photos/image.gif': '',
        '/photos/image.tif': '',
        '/photos/image.tiff': '',
        '/photos/image.webp': '',
        '/photos/image.dng': '',
        '/photos/image.nef': '',
        '/videos/video.mp4': '',
        '/videos/video.mov': '',
        '/videos/video.webm': '',
      });

      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos/', '/videos/'];
      const paths: string[] = await sut.crawl(options);

      expect(paths.sort()).toEqual(
        [
          '/photos/image.jpg',
          '/photos/image.jpeg',
          '/photos/image.heic',
          '/photos/image.heif',
          '/photos/image.png',
          '/photos/image.gif',
          '/photos/image.tif',
          '/photos/image.tiff',
          '/photos/image.webp',
          '/photos/image.dng',
          '/photos/image.nef',
          '/videos/video.mp4',
          '/videos/video.mov',
          '/videos/video.webm',
        ].sort(),
      );
    });

    it('should check file extensions without case sensitivity', async () => {
      mockfs({
        '/photos/image.jpg': '',
        '/photos/image.Jpg': '',
        '/photos/image.jpG': '',
        '/photos/image.JPG': '',
        '/photos/image.jpEg': '',
        '/photos/image.TIFF': '',
        '/photos/image.tif': '',
        '/photos/image.dng': '',
        '/photos/image.NEF': '',
      });

      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos/'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(
        [
          '/photos/image.jpg',
          '/photos/image.Jpg',
          '/photos/image.jpG',
          '/photos/image.JPG',
          '/photos/image.jpEg',
          '/photos/image.TIFF',
          '/photos/image.tif',
          '/photos/image.dng',
          '/photos/image.NEF',
        ].sort(),
      );
    });

    afterEach(() => {
      mockfs.restore();
    });
  });
});
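The `FilesystemProvider.crawl` implementation itself is not part of this excerpt. As a rough sketch of the behaviour these tests pin down (recursive crawling, trailing-slash handling, glob import paths, case-insensitive extension filtering, and exclusion patterns), a glob-based crawler could look roughly like the following; the `fast-glob` dependency and the exact extension list are assumptions, not the provider's actual code.

// Sketch only: not the actual FilesystemProvider implementation.
import fg from 'fast-glob';

// Assumed subset of the photo/video extensions Immich accepts.
const EXTENSIONS = ['jpg', 'jpeg', 'heic', 'heif', 'png', 'gif', 'tif', 'tiff', 'webp', 'dng', 'nef', 'mp4', 'mov', 'webm'];

export async function crawl(pathsToCrawl: string[], exclusionPatterns: string[] = []): Promise<string[]> {
  if (pathsToCrawl.length === 0) {
    return [];
  }

  // Normalize trailing slashes and recurse into every directory matching the (possibly globbed) path.
  const patterns = pathsToCrawl.map((p) => `${p.replace(/\/+$/, '')}/**/*.{${EXTENSIONS.join(',')}}`);

  return fg(patterns, {
    absolute: true,
    caseSensitiveMatch: false, // ".JPG" and ".jpg" are both picked up
    onlyFiles: true,
    ignore: exclusionPatterns, // e.g. ['**/raw/**', '**/*.tif']
  });
}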
@ -0,0 +1,183 @@
import { ILibraryRepository, LibraryStatsResponseDto } from '@app/domain';
import { Injectable } from '@nestjs/common';
import { InjectRepository } from '@nestjs/typeorm';
import { IsNull, Not } from 'typeorm';
import { Repository } from 'typeorm/repository/Repository';
import { LibraryEntity, LibraryType } from '../entities';

@Injectable()
export class LibraryRepository implements ILibraryRepository {
  constructor(@InjectRepository(LibraryEntity) private repository: Repository<LibraryEntity>) {}

  get(id: string, withDeleted = false): Promise<LibraryEntity | null> {
    return this.repository.findOneOrFail({
      where: {
        id,
      },
      relations: { owner: true },
      withDeleted,
    });
  }

  existsByName(name: string, withDeleted = false): Promise<boolean> {
    return this.repository.exist({
      where: {
        name,
      },
      withDeleted,
    });
  }

  getCountForUser(ownerId: string): Promise<number> {
    return this.repository.countBy({ ownerId });
  }

  getDefaultUploadLibrary(ownerId: string): Promise<LibraryEntity | null> {
    return this.repository.findOne({
      where: {
        ownerId: ownerId,
        type: LibraryType.UPLOAD,
      },
      order: {
        createdAt: 'ASC',
      },
    });
  }

  getUploadLibraryCount(ownerId: string): Promise<number> {
    return this.repository.count({
      where: {
        ownerId: ownerId,
        type: LibraryType.UPLOAD,
      },
    });
  }

  getAllByUserId(ownerId: string, type?: LibraryType): Promise<LibraryEntity[]> {
    return this.repository.find({
      where: {
        ownerId,
        isVisible: true,
        type,
      },
      relations: {
        owner: true,
      },
      order: {
        createdAt: 'ASC',
      },
    });
  }

  getAll(withDeleted = false, type?: LibraryType): Promise<LibraryEntity[]> {
    return this.repository.find({
      where: { type },
      relations: {
        owner: true,
      },
      order: {
        createdAt: 'ASC',
      },
      withDeleted,
    });
  }

  getAllDeleted(): Promise<LibraryEntity[]> {
    return this.repository.find({
      where: {
        isVisible: true,
        deletedAt: Not(IsNull()),
      },
      relations: {
        owner: true,
      },
      order: {
        createdAt: 'ASC',
      },
      withDeleted: true,
    });
  }

  create(library: Omit<LibraryEntity, 'id' | 'createdAt' | 'updatedAt' | 'ownerId'>): Promise<LibraryEntity> {
    return this.repository.save(library);
  }

  async delete(id: string): Promise<void> {
    await this.repository.delete({ id });
  }

  async softDelete(id: string): Promise<void> {
    await this.repository.softDelete({ id });
  }

  async update(library: Partial<LibraryEntity>): Promise<LibraryEntity> {
    return this.save(library);
  }

  async getStatistics(id: string): Promise<LibraryStatsResponseDto> {
    const stats = await this.repository
      .createQueryBuilder('libraries')
      .addSelect(`COUNT(assets.id) FILTER (WHERE assets.type = 'IMAGE' AND assets.isVisible)`, 'photos')
      .addSelect(`COUNT(assets.id) FILTER (WHERE assets.type = 'VIDEO' AND assets.isVisible)`, 'videos')
      .addSelect('COALESCE(SUM(exif.fileSizeInByte), 0)', 'usage')
      .leftJoin('libraries.assets', 'assets')
      .leftJoin('assets.exifInfo', 'exif')
      .groupBy('libraries.id')
      .where('libraries.id = :id', { id })
      .getRawOne();

    return {
      photos: Number(stats.photos),
      videos: Number(stats.videos),
      usage: Number(stats.usage),
      total: Number(stats.photos) + Number(stats.videos),
    };
  }

  async getOnlineAssetPaths(libraryId: string): Promise<string[]> {
    // Return all non-offline asset paths for a given library
    const rawResults = await this.repository
      .createQueryBuilder('library')
      .innerJoinAndSelect('library.assets', 'assets')
      .where('library.id = :id', { id: libraryId })
      .andWhere('assets.isOffline = false')
      .select('assets.originalPath')
      .getRawMany();

    const results: string[] = [];

    for (const rawPath of rawResults) {
      results.push(rawPath.assets_originalPath);
    }

    return results;
  }

  async getAssetIds(libraryId: string, withDeleted = false): Promise<string[]> {
    // Return all asset ids for a given library
    let query = this.repository
      .createQueryBuilder('library')
      .innerJoinAndSelect('library.assets', 'assets')
      .where('library.id = :id', { id: libraryId })
      .select('assets.id');

    if (withDeleted) {
      query = query.withDeleted();
    }

    const rawResults = await query.getRawMany();

    const results: string[] = [];

    for (const rawPath of rawResults) {
      results.push(rawPath.assets_id);
    }

    return results;
  }

  private async save(library: Partial<LibraryEntity>) {
    const { id } = await this.repository.save(library);
    return this.repository.findOneByOrFail({ id });
  }
}
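As a small, hypothetical usage sketch built only from the repository methods above (the function name, the injected `libraryRepository`, and the pre-existing `libraryId` are illustrative assumptions), reading a library's statistics and its still-present asset paths could look like this:

// Hypothetical usage; `libraryRepository` is assumed to be injected through Nest's DI.
async function reportOnLibrary(libraryRepository: LibraryRepository, libraryId: string): Promise<void> {
  const stats = await libraryRepository.getStatistics(libraryId);
  console.log(`photos=${stats.photos} videos=${stats.videos} usage=${stats.usage} bytes total=${stats.total}`);

  // Paths of assets that are still present on disk (isOffline = false)
  const onlinePaths = await libraryRepository.getOnlineAssetPaths(libraryId);
  console.log(`${onlinePaths.length} online assets`);
}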