uwsift.model package

Submodules

uwsift.model.area_definitions_manager module

uwsift.model.area_definitions_manager

Manage area definitions from Satpy/Pyresample.

author:

A.Rettig <alexander.rettig@askvisual.de>

class uwsift.model.area_definitions_manager.AreaDefinitionsManager[source]

Bases: object

Manage Pyresample AreaDefinitions, i.e., check their availability in the accessible configuration and provide them based on their display name. The display name is the one to be used in the GUI; it should be less abstract than the area_id and shorter than the description provided in the AreaDefinition object.

classmethod area_def_by_id(id)[source]
classmethod area_def_by_name(name)[source]
classmethod available_area_def_names()[source]
classmethod default_area_def_name()[source]
classmethod init_available_area_defs() None[source]
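Its interface can be pictured as a small registry that maps display names to area definitions. A minimal sketch, in which plain strings and a hypothetical configured mapping stand in for Pyresample AreaDefinition objects and the accessible configuration:

```python
class AreaDefinitionsRegistry:
    """Minimal sketch of the display-name -> area-definition lookup.

    Plain strings stand in for Pyresample AreaDefinition objects here.
    """

    _defs: dict = {}

    @classmethod
    def init_available_area_defs(cls, configured: dict) -> None:
        # 'configured' is a hypothetical stand-in for the accessible configuration
        cls._defs = dict(configured)

    @classmethod
    def available_area_def_names(cls) -> list:
        return list(cls._defs)

    @classmethod
    def area_def_by_name(cls, name):
        return cls._defs.get(name)

    @classmethod
    def default_area_def_name(cls):
        # in this sketch, the first configured entry serves as the default
        return next(iter(cls._defs), None)
```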

uwsift.model.catalogue module

class uwsift.model.catalogue.Catalogue[source]

Bases: object

static collect_files_for_data_catalogue(search_path: str, filter_patterns: List[str], filter: dict) Set[str] | None[source]

Combine all the methods which are needed to create the data catalogue, i.e., coordinate its creation.

static extract_query_parameters(query: dict)[source]

Extract the values of parameters relevant for a catalogue query from the given dictionary query and return them as tuple.

static glob_find_files(patterns: List[str], search_path: str) Set[str][source]

Use given globbing patterns to find matching files in the directory given by search_path.
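A minimal stdlib-only sketch of this behaviour (a stand-in, not the actual implementation):

```python
import glob
import os
from typing import List, Set


def glob_find_files(patterns: List[str], search_path: str) -> Set[str]:
    """Collect all files under search_path matching any of the glob patterns."""
    found: Set[str] = set()
    for pattern in patterns:
        found.update(glob.glob(os.path.join(search_path, pattern)))
    return found
```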

static group_files_by_group_keys(files: Set[str], group_keys: List[str], reader: str) dict | None[source]

Group given files according to the group_keys configured for the given reader.

A file group contains the name of the reader and the list of those file paths in files which share the same file name parts identified by the group keys.

The returned dictionary associates each file group with its group ID. The group ID itself is a sorted tuple of all file paths contained in the group (the reader is not part of that group ID tuple, though).
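The grouping can be pictured as follows; a regular expression with named groups is a hypothetical stand-in for the trollsift-based file name parsing of the real implementation:

```python
import re
from collections import defaultdict
from typing import Dict, List, Optional, Set


def group_files_by_group_keys(files: Set[str], name_regex: str,
                              group_keys: List[str], reader: str) -> Optional[dict]:
    """Group files sharing the same file name parts for the given group keys.

    'name_regex' with named groups is a hypothetical stand-in for the
    trollsift-based file name parsing of the real implementation.
    """
    by_key_values: Dict[tuple, List[str]] = defaultdict(list)
    for path in sorted(files):
        match = re.search(name_regex, path)
        if match is None:
            return None  # unparsable file name: give up, as a simplification
        by_key_values[tuple(match.group(k) for k in group_keys)].append(path)
    # the group ID is the sorted tuple of the member file paths
    return {tuple(sorted(paths)): {"reader": reader, "files": paths}
            for paths in by_key_values.values()}
```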

static query_for_satpy_importer_kwargs_and_readers(reader: str, search_path: str, filter_patterns: List[str], group_keys: List[str], constraints: dict, products)[source]

Create a data catalogue with the given parameters and generate importer keyword arguments. If an error occurs, it is caught and the message is logged. If no files are found with the given parameters, the importer keyword arguments are not created.

class uwsift.model.catalogue.GlobbingCreator[source]

Bases: object

Create glob patterns from series of constraints.

The GlobbingCreator is responsible for creating globbing patterns suitable for collecting files from a directory with glob.glob().

To do this the GlobbingCreator takes:

  • an MTG-SIFT/satpy/trollsift filter_pattern like "{platform_name:4s}-{channel:_<6s}-{service:3s}-{start_time:%Y%m%d%H%M}"

  • a constraints dictionary, which is part of a dictionary/query entry of a catalogue configuration associated with a reader.

Thus from the following catalogue configuration:

catalogue:
  - reader: seviri_l1b_hrit
    search_path: /path/to/seviri/data/
    constraints:
      platform_name: MSG4
      channel:
        - ______
        - IR_108
      start_time:
        type: datetime
        Y: 2019
        m: 12
        d: 31
        H: [0, 6, 12, 18] # equivalent to range(0, 24, 6)

it gets the constraints dictionary:

{
    'platform_name' : "MSG4",
    'channel' : ["______", "IR_108"],
    'start_time' : {
        'type' : "datetime",
        'Y' : 2019,
        'm' : 12,
        'd' : 31,
        'H' : [0, 6, 12, 18]
    }
}

Expanding filter_pattern

First the filter pattern is expanded to become:

expanded_filter_pattern = "{platform_name:4s}-{channel:_<6s}-{service:3s}-{start_time_Y:%Y}{start_time_m:%m}{start_time_d:%d}{start_time_H:%H}{start_time_M:%M}"

Expanding the constraints

The constraints dictionary is expanded to become a list of dictionaries, where each single dictionary contains only key-value pairs with scalar values (no sequences or mappings). The list of dictionaries contains all combinations which can be created from the given constraints.
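For scalar and sequence values this expansion is a Cartesian product. A minimal sketch (entries of type datetime, which need the special handling described below, are left out):

```python
from itertools import product
from typing import List


def expand_constraints(constraints: dict) -> List[dict]:
    """Expand sequence values into all combinations of scalar-only dicts."""
    keys = list(constraints)
    value_lists = [value if isinstance(value, list) else [value]
                   for value in constraints.values()]
    return [dict(zip(keys, combination)) for combination in product(*value_lists)]
```

For {'platform_name': "MSG4", 'channel': ["______", "IR_108"]} this yields the two scalar dictionaries combining MSG4 with each channel.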

Expanding an entry of type: datetime

The given start_time configuration represents several actual datetime values because of the sequence given for H. The following dt_XX variables are only abbreviations for use below:

dt_00 = datetime(2019, 12, 31, hour=0,  tz=timezone.utc)
dt_06 = datetime(2019, 12, 31, hour=6,  tz=timezone.utc)
dt_12 = datetime(2019, 12, 31, hour=12, tz=timezone.utc)
dt_18 = datetime(2019, 12, 31, hour=18, tz=timezone.utc)

Having this, a list of expanded_datetime dictionaries is generated:

[{'start_time_Y': dt_00, 'start_time_m': dt_00, 'start_time_d': dt_00, 'start_time_H': dt_00},
 {'start_time_Y': dt_06, 'start_time_m': dt_06, 'start_time_d': dt_06, 'start_time_H': dt_06},
 {'start_time_Y': dt_12, 'start_time_m': dt_12, 'start_time_d': dt_12, 'start_time_H': dt_12},
 {'start_time_Y': dt_18, 'start_time_m': dt_18, 'start_time_d': dt_18, 'start_time_H': dt_18}]

Note that new keys are generated, one for each of the datetime format code directives (%Y, %m, …, see datetime / strftime() and strptime() Behavior) which are given as keys (without the percent sign prefix) in the original constraints.

CAUTION: Expansion of sequences for type: datetime constraints is not implemented yet, entries for the datetime format directives must be single integers for now!
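The expansion depicted above can be sketched as follows (note that, per the caution, expanding a sequence such as the one for H illustrates intended rather than implemented behaviour):

```python
from datetime import datetime, timezone
from itertools import product
from typing import List


def expand_datetime_constraint(field: str, spec: dict) -> List[dict]:
    """Sketch of expanding a `type: datetime` constraint into dictionaries
    keyed by '<field>_<directive>' (an illustration, not the actual code)."""
    values = {key: value if isinstance(value, list) else [value]
              for key, value in spec.items() if key != "type"}
    order = [d for d in ("Y", "m", "d", "H", "M") if d in values]
    expanded = []
    for combination in product(*(values[d] for d in order)):
        parts = dict(zip(order, combination))
        dt = datetime(parts.get("Y", 1900), parts.get("m", 1), parts.get("d", 1),
                      hour=parts.get("H", 0), minute=parts.get("M", 0),
                      tzinfo=timezone.utc)
        # every generated key maps to the same datetime value
        expanded.append({f"{field}_{d}": dt for d in order})
    return expanded
```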

Result of expansion

For the given example this expanded_constraints list is:

[{'platform_name': 'MSG4', 'channel': '______', 'start_time_Y': dt_00, 'start_time_m': dt_00, 'start_time_d': dt_00, 'start_time_H': dt_00},
 {'platform_name': 'MSG4', 'channel': '______', 'start_time_Y': dt_06, 'start_time_m': dt_06, 'start_time_d': dt_06, 'start_time_H': dt_06},
 {'platform_name': 'MSG4', 'channel': '______', 'start_time_Y': dt_12, 'start_time_m': dt_12, 'start_time_d': dt_12, 'start_time_H': dt_12},
 {'platform_name': 'MSG4', 'channel': '______', 'start_time_Y': dt_18, 'start_time_m': dt_18, 'start_time_d': dt_18, 'start_time_H': dt_18},
 {'platform_name': 'MSG4', 'channel': 'IR_108', 'start_time_Y': dt_00, 'start_time_m': dt_00, 'start_time_d': dt_00, 'start_time_H': dt_00},
 {'platform_name': 'MSG4', 'channel': 'IR_108', 'start_time_Y': dt_06, 'start_time_m': dt_06, 'start_time_d': dt_06, 'start_time_H': dt_06},
 {'platform_name': 'MSG4', 'channel': 'IR_108', 'start_time_Y': dt_12, 'start_time_m': dt_12, 'start_time_d': dt_12, 'start_time_H': dt_12},
 {'platform_name': 'MSG4', 'channel': 'IR_108', 'start_time_Y': dt_18, 'start_time_m': dt_18, 'start_time_d': dt_18, 'start_time_H': dt_18}]

Expanding an entry of type: relative_datetime

To match the replacement field {start_time:%Y%m%d%H%M} of the filter_pattern relative to the current time, a different configuration must be given for the constraint start_time:

start_time:
    type: relative_datetime
    d: [0, -1] # equivalent to range(0, -2, -1)

From that configuration the following list of expanded_datetime dictionaries is generated:

[{'start_time_Y': dt_r0, 'start_time_m': dt_r0, 'start_time_d': dt_r0},
 {'start_time_Y': dt_r1, 'start_time_m': dt_r1, 'start_time_d': dt_r1}]

where the dt_XX variables (used for abbreviation here again) are

now_utc = datetime.now(timezone.utc)
dt_r0 = now_utc + relativedelta(days=0)
dt_r1 = now_utc + relativedelta(days=-1)

which means, assuming it is now 2020-10-01 12:45:06 UTC:

dt_r0 == datetime.fromisoformat("2020-10-01T12:45:06+00:00")
dt_r1 == datetime.fromisoformat("2020-09-30T12:45:06+00:00")

Note that new keys are generated analogously to the type: datetime case. For now, which of these keys are generated is computed from the single given key by taking all entries from the list ['Y', 'm', 'd', 'H', 'M'] up to and including the given one.

This approach is not suitable for all possible datetime-like replacement fields, notably not for the datetime filename parts of GOES-R data which use day of the year as a zero-padded decimal number (directive %j) or if the year is represented only with two digits (directive %y) for example. These cases are left for future improvements.
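Restricted to day offsets, this can be sketched as follows; datetime.timedelta stands in for the relativedelta used above, and the now parameter only makes the sketch reproducible:

```python
from datetime import datetime, timedelta, timezone
from typing import List, Optional


def expand_relative_datetime(field: str, spec: dict,
                             now: Optional[datetime] = None) -> List[dict]:
    """Sketch of expanding a `type: relative_datetime` constraint with day
    offsets only (the real code supports other directives via relativedelta)."""
    if now is None:
        now = datetime.now(timezone.utc)
    order = ["Y", "m", "d", "H", "M"]
    given = next(d for d in order if d in spec)
    # generate keys for all directives up to and including the given one
    keys = order[: order.index(given) + 1]
    offsets = spec[given] if isinstance(spec[given], list) else [spec[given]]
    return [{f"{field}_{d}": now + timedelta(days=offset) for d in keys}
            for offset in offsets]
```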

Putting everything together

The wanted globbing patterns are generated by using trollsift.parser.globify() for the file_pattern with each of the dictionaries in expanded_constraints.

For the type: datetime example case this yields:

MSG4-______-???-2019123100??
MSG4-______-???-2019123106??
MSG4-______-???-2019123112??
MSG4-______-???-2019123118??
MSG4-IR_108-???-2019123100??
MSG4-IR_108-???-2019123106??
MSG4-IR_108-???-2019123112??
MSG4-IR_108-???-2019123118??

and for the type: relative_datetime case:

MSG4-______-???-20200930????
MSG4-______-???-20201001????
MSG4-IR_108-???-20200930????
MSG4-IR_108-???-20201001????
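trollsift.parser.globify() formats the replacement fields it has values for and turns the rest into glob wildcards. A rough stdlib-only stand-in for that behaviour; the fixed widths assumed for unknown datetime directives are a simplification:

```python
import re
from datetime import datetime, timezone

# assumed fixed widths for the datetime directives used in the examples
DIRECTIVE_WIDTHS = {"%Y": 4, "%m": 2, "%d": 2, "%H": 2, "%M": 2}


def globify(pattern: str, **known) -> str:
    """Format known replacement fields, glob-wildcard the unknown ones."""
    def replace(match: "re.Match") -> str:
        name, spec = match.group(1), match.group(2)
        if name in known:
            return format(known[name], spec)  # datetimes honour strftime specs
        if spec in DIRECTIVE_WIDTHS:
            return "?" * DIRECTIVE_WIDTHS[spec]
        width = re.search(r"(\d+)", spec)
        return "?" * int(width.group(1)) if width else "*"
    return re.sub(r"\{(\w+):([^}]*)\}", replace, pattern)
```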

General Note

The current implementation is not robust against bad Catalogue configurations, since it does not check them thoroughly for errors. It should work for correct configurations but may fail without giving any helpful feedback for broken ones; the writer of the configuration is therefore asked to be careful. Refrain from using sequence entries for too many replacement fields, since this would lead to a combinatorial explosion (which is not restrained).

In effect, the Catalogue configuration defines a kind of query language; implementing a complete validation for it would require considerable effort.

static construct_globbing_patterns(filter_patterns: List[str], constraints: dict) List[str][source]

Construct a list of globbing patterns from the given filter_patterns with the given constraints applied.

Returns: a list of strings, each usable as parameter for glob.glob()

static init_now()[source]

Initialize the GlobbingCreator so that “now” at the time of the call is used as the reference for type: relative_datetime constraints, i.e., datetime constraints relative to the current time.

now_utc: datetime = datetime.datetime(2024, 1, 1, 20, 50, 59, 375337, tzinfo=datetime.timezone.utc)
class uwsift.model.catalogue.SceneManager[source]

Bases: object

The (future) purpose of this class is to keep information about already seen Satpy Scenes.

Satpy Scenes are in a way collections of files as well as the information which products can be “made” from them.

TODO: This purpose may overlap with similar tasks already implemented elsewhere in SIFT; check this.

TODO: Adopt the function create_scenes()…

get_data_ids_for_products(all_available_data_ids, products) List[DataID][source]

Look up DataIDs of products in all_available_data_ids

TODO: Notify about products for which no DataID was found

uwsift.model.composite_recipes module

Composite recipe utilities and classes.

Composites in SIFT can be generated in two main ways:

  • Algebraic layers: Combine one or more layers into a new single band layer by performing arithmetic between the input layers. These composites are typically calculated once, can’t be modified, and are cached on disk.

  • RGB layers: Combine 1-3 layers into a red, green, blue channel image to produce a colorful RGB image. These composites are typically generated on-the-fly by the GPU by providing all inputs as textures. These composites are typically not cached on disk.

This module deals with the on-the-fly type composites like RGB layers. Since these composites are not cached, the recipes to make them must be stored so they can be recreated in the future.

class uwsift.model.composite_recipes.AlgebraicRecipe(name: str, input_layer_ids: list = <factory>, read_only: bool = False, operation_kind: str = <factory>, operation_formula: str = <factory>)[source]

Bases: Recipe

classmethod from_algebraic(name, x=None, y=None, z=None, operation_kind=None, operation_formula=None)[source]
classmethod kind()[source]
property modified: bool
operation_formula: str
operation_kind: str
class uwsift.model.composite_recipes.CompositeRecipe(name: str, input_layer_ids: list = <factory>, read_only: bool = False, color_limits: list = <factory>, gammas: list = <factory>)[source]

Bases: Recipe

Recipe class responsible for storing the combination of 1-3 layers as red, green and blue channel image to produce a colorful RGB image. These composites are typically generated on-the-fly by the GPU by providing all inputs as textures.

Do not instantiate this class directly but use CompositeRecipe.from_rgb().

property blue

Get the control parameters for the blue channel as a dict.

color_limits: list
classmethod from_rgb(name, r=None, g=None, b=None, color_limits=None, gammas=None)[source]
gammas: list
property green

Get the control parameters for the green channel as a dict.

classmethod kind()[source]
property red

Get the control parameters for the red channel as a dict.

set_default_color_limits(r=None, g=None, b=None)[source]

Set color limits based on dependency limits

class uwsift.model.composite_recipes.Recipe(name: str, input_layer_ids: list = <factory>, read_only: bool = False)[source]

Bases: object

Recipe base class. All recipes belong to a Layer and store information which input Layers provide the image data that is used to generate the images of their Layer.

copy(new_name)[source]

Get a copy of this recipe with a new name

property id
input_layer_ids: list
abstract classmethod kind()[source]
name: str
read_only: bool = False
to_dict()[source]

Convert to YAML-compatible dict.

class uwsift.model.composite_recipes.RecipeManager(parent=None, config_dir=None)[source]

Bases: QObject

create_algebraic_recipe(layers)[source]
create_rgb_recipe(layers)[source]

Create an RGB recipe and trigger a signal that an RGB composite layer can be created.

Parameters:

layers – The layers which will be used to create an RGB composite

load_available_recipes()[source]

Load recipes from stored config files

open_recipe(pathname)[source]

Open a recipe file and return a CompositeRecipe object.

Parameters:

pathname (str) – Full path to a recipe YAML document

Raises:

ValueError – if any error occurs reading and loading the recipe

remove_layer_as_recipe_input(layer_uuid: UUID)[source]

Remove a layer from all recipes in which it is used as input layer.

Must be called before the layer given by the layer_uuid can be removed from the system.

Parameters:

layer_uuid – UUID of the layer to be removed from all recipes

save_recipe(recipe, filename=None, overwrite=False)[source]
update_algebraic_recipe_input_layers(recipe: AlgebraicRecipe, channel: str, layer_uuid: UUID | None)[source]
update_algebraic_recipe_operation_formula(recipe: AlgebraicRecipe, operation_formula: str)[source]
update_algebraic_recipe_operation_kind(recipe: AlgebraicRecipe, operation_kind: str)[source]
update_recipe_name(recipe: CompositeRecipe, name: str)[source]
update_rgb_recipe_color_limits(recipe: CompositeRecipe, channel: str, clim: Tuple[float, float])[source]

Update the color limit value of the given channel

update_rgb_recipe_gammas(recipe: CompositeRecipe, channel: str, gamma: float)[source]

Update the gamma value of the given channel

update_rgb_recipe_input_layers(recipe: CompositeRecipe, channel: str, layer_uuid: UUID | None, clims: Tuple[float | None, float | None], gamma: float)[source]

Update the input layers in the recipe for a specific channel. With this change, the color limits and the gamma value of this specific channel have to be changed, too.

uwsift.model.document module

uwsift.model.document

The document is an interface to further process some user interactions and delegate the import of new content to the workspace. It also contains all metadata information of all loaded records.

The document handles the following tasks:
  • import new files

  • instruct the workspace to import new content

  • create a Presentation using metadata information

  • manage the currently active area definition used to present the data

  • manage user color maps

The communication between the document and other parts of the application is done with signal/slot connections.

Document has zero or more Colormaps, determining how they’re presented

The document does not own data (content). It only owns metadata (info).

All entities in the Document have a UUID that is their identity throughout their lifecycle, and is often used as shorthand between subsystems. Document rarely deals directly with content.

author:

R.K.Garcia <rayg@ssec.wisc.edu> and others

copyright:

2015 by University of Wisconsin Regents, see AUTHORS for more details

license:

GPLv3, see LICENSE for more details

class uwsift.model.document.Document(workspace: BaseWorkspace, queue: TaskQueue, config_dir='/home/docs/.config/SIFT/settings', **kwargs)[source]

Bases: QObject

Storage for dataset info and user information.

This is the low-level “internal” interface that acts as a signaling hub. Direct access to the document is being deprecated. Most direct access patterns should be migrated to using a contextual view of the document, in order to reduce abstraction leakage and permit the document storage to evolve.

activate_product_uuid_as_new_dataset(uuid: UUID, insert_before=0, **importer_kwargs)[source]
area_definition(area_definition_name=None)[source]
change_projection(area_def_name=None)[source]
change_projection_index(idx)[source]
current_projection_index()[source]
find_colormap(colormap)[source]
get_uuids()[source]
import_files(paths, insert_before=0, **importer_kwargs) Generator[dict, None, None][source]

Load product metadata and content from provided file paths.

Parameters:
  • paths – paths to open

  • insert_before – where to insert them in layer manager

Returns:

remove_dataset_info(uuid: UUID)[source]

Remove the info of a dataset because it is no longer needed

Parameters:

uuid – UUID of the dataset which is removed

remove_user_colormap(name)[source]
sort_product_uuids(uuids: Iterable[UUID]) List[UUID][source]
update_user_colormap(colormap, name)[source]

uwsift.model.layer_item module

class uwsift.model.layer_item.LayerItem(model, info: frozendict, presentation: Presentation, grouping_key=None, recipe: Recipe | None = None, parent=None)[source]

Bases: object

add_algebraic_dataset(presentation: Presentation | None, info: frozendict, sched_time: datetime, input_datasets_uuids: List[UUID])[source]
add_dataset(info: frozendict, presentation: Presentation | None = None) ProductDataset | None[source]

Add ProductDataset to Layer. If a Presentation is passed it overwrites the Presentation of the layer for the given dataset.

Parameters:
  • info – Mapping providing metadata for ProductDataset instantiation

  • presentation – Mapping with visualisation configuration for the dataset to add.

Returns:

Newly created ProductDataset if a dataset with the same UUID or for the same scheduled time does not already exist in the layer

add_multichannel_dataset(presentation: Presentation | None, sched_time: datetime, input_datasets_uuids: List[UUID], input_datasets_infos: List[frozendict | None]) ProductDataset | None[source]

Add multichannel ProductDataset to Layer. If a Presentation is passed it overwrites the Presentation of the layer for the given dataset.

Parameters:
  • presentation – Mapping with visualisation configuration for the dataset to add.

  • sched_time

  • input_datasets_uuids

  • input_datasets_infos – List of mapping providing metadata for ProductDatasets

Returns:

Newly created multichannel ProductDataset if a dataset with the same scheduled time does not already exist in the layer

data(column: int)[source]
describe_timeline()[source]

Get a string containing the layer’s descriptor and the scheduling times in its timeline together with the corresponding dataset UUIDs.

property descriptor
determine_initial_clims()[source]

Get a min/max value pair to be used as limits for colour mapping.

Except for algebraic composites, the preferred candidate range is the valid range stored in the layer metadata, which has been determined elsewhere. If that is missing, an attempt is made to determine a range from the first dataset of the layer.

For algebraic composites, the range is the min/max value pair calculated from all datasets of the layer.

If all of the above fail, return invalid colour limits (+inf, -inf) (!).
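The fallback chain can be summarized schematically; the helper below and its inputs are assumptions, not the method's actual signature:

```python
import math
from typing import List, Optional, Tuple


def initial_clims(valid_range: Optional[Tuple[float, float]],
                  dataset_ranges: List[Tuple[float, float]],
                  is_algebraic: bool) -> Tuple[float, float]:
    """Schematic of the colour-limit fallback chain described above."""
    if is_algebraic and dataset_ranges:
        # algebraics: min/max calculated over all datasets of the layer
        mins, maxs = zip(*dataset_ranges)
        return (min(mins), max(maxs))
    if valid_range is not None:
        return valid_range  # preferred: valid range from the layer metadata
    if dataset_ranges:
        return dataset_ranges[0]  # fall back to the first dataset's range
    return (math.inf, -math.inf)  # invalid colour limits
```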

property dynamic
static extract_layer_info(info: frozendict) frozendict[source]
get_active_product_datasets() List[ProductDataset][source]
get_actual_range_from_first_active_dataset() Tuple[source]

Returns the calculated actual range value of the first active dataset

get_actual_range_from_layer() Tuple[source]

Calculate on the fly the actual range of the layer.

The ‘actual range’ of a layer is the union of the ‘actual ranges’ of all datasets belonging to that layer.

get_dataset_by_uuid(uuid: UUID) ProductDataset | None[source]
get_datasets_uuids() List[UUID][source]
get_first_active_product_dataset() ProductDataset | None[source]
has_in_timeline(dataset_uuid) bool[source]
property kind
property name
property opacity
property order
property presentation
property probe_value
remove_dataset(sched_time)[source]

Remove the dataset for the given datetime from the layer.

Gracefully does nothing if no dataset with the given sched_time exists in the layer.

replace_recipe_layer_info(info: frozendict)[source]

Replace the info of a recipe layer with the given one.

This will raise a ValueError if this layer is not a recipe layer.

property short_descriptor

Return the short display descriptor of a layer

property timeline
update_invariable_display_data() None[source]
property valid_range
property visible

uwsift.model.layer_model module

class uwsift.model.layer_model.LayerModel(document: Document, parent=None, policy=None)[source]

Bases: QAbstractItemModel

add_dataset(info: frozendict, presentation: Presentation) None[source]

Slot specifically to fill the model from Document’s activate_product_uuid_as_new_dataset. For every loaded dataset, Document emits the didAddDataset signal, which must be connected to this method.

Parameters:
  • info – Dictionary of dataset metadata information.

  • presentation – Presentation to be set for layer, when a new one has to be created to hold the dataset, ignored otherwise.

change_color_limits_for_layer(uuid: UUID, color_limits: object)[source]
change_colormap_for_layer(uuid: UUID, colormap: object)[source]
change_gamma_for_layer(uuid: UUID, gamma: float | List[float])[source]
columnCount(self, parent: QModelIndex = QModelIndex()) int[source]
create_algebraic_composite_layer(recipe: AlgebraicRecipe)[source]

Creates a layer which has an algebraic composite recipe

Parameters:

recipe – the algebraic composite recipe which the created layer gets as its recipe

static create_reasonable_algebraic_composite_default()[source]

Creates a reasonable default layer list for algebraic composites.

Returns:

the reasonable default layer list

static create_reasonable_rgb_composite_default()[source]

Creates a reasonable default layer list for RGB composites.

Returns:

the reasonable default layer list

create_rgb_composite_layer(recipe: CompositeRecipe)[source]

Creates a layer which has an RGB composite recipe.

Parameters:

recipe – the RGB composite recipe which the created layer gets as its recipe

data(self, index: QModelIndex, role: int = Qt.ItemDataRole.DisplayRole) Any[source]
dropMimeData(self, data: Optional[QMimeData], action: Qt.DropAction, row: int, column: int, parent: QModelIndex) bool[source]
flags(self, index: QModelIndex) Qt.ItemFlags[source]
get_dataset_by_uuid(dataset_uuid: UUID) ProductDataset | None[source]

Find the dataset with the given UUID in the layer model and return it, or None if it is not in the model.

Parameters:

dataset_uuid

Returns:

the dataset if found, None otherwise

get_dataset_presentation_by_uuid(uuid)[source]

Get the presentation of the dataset with the given UUID. If the dataset has no presentation, the presentation of the layer which owns this dataset is returned.

Parameters:

uuid – UUID of the dataset whose presentation should be returned

Returns:

either the presentation of the dataset or of the layer, if the dataset has no presentation

get_dynamic_layers()[source]
get_input_layers_info(recipe_layer: LayerItem) List[frozendict | None][source]
get_layer_by_uuid(uuid: UUID) LayerItem | None[source]
get_layers_by_uuids(layer_uuids: List[UUID])[source]

Get layers which have the given identifiers as an attribute.

Parameters:

layer_uuids – identifiers used to look up the wanted layers

Returns:

the layers found

get_probeable_layers() List[LayerItem][source]

Get LayerItems which may contain data suitable for probing operations.

Currently only single channel raster data can be point or region probed, thus the layer must be one capable of carrying datasets of kind IMAGE or COMPOSITE.

hasChildren(parent=None) bool[source]

For now the Layer model does not support layer hierarchies (group layers) thus only the root index can have children.

Parameters:

parent – model index to query

Returns:

true if parent is the root index and has at least one row and column

headerData(self, section: int, orientation: Qt.Orientation, role: int = Qt.ItemDataRole.DisplayRole) Any[source]
index(self, row: int, column: int, parent: QModelIndex = QModelIndex()) QModelIndex[source]
init_system_layers()[source]

Create layers whose existence is controlled by the system, not by the user.

Currently two system layers are set up, one for a latitude/longitude grid, the second for political borders.

mimeData(self, indexes: Iterable[QModelIndex]) Optional[QMimeData][source]
mimeTypes(self) List[str][source]
on_didMatchTimes(t_matched_dict: dict)[source]
on_point_probe_set(probe_name, state, xy_pos, uuids=None)[source]

User has clicked on a point probe; determine relative and absolute values for all document image layers.

order(layer: LayerItem) int[source]

Method to return the order of a specific layer within the model. Determined by its index in the model.

Parameters:

layer – Layer whose order is queried.

Returns:

Integer representing the order of queried layer.

parent(self, child: QModelIndex) QModelIndex[source]
parent(self) Optional[QObject]
remove_datasets_from_all_layers(dataset_uuids)[source]

This method can be used if only the datasets, and not the whole layer, should be removed, or if datasets from different layers should be removed (or the caller does not know to which layer the datasets belong). A dataset can only be removed if a UUID in the given list belongs to an existing dataset.

Parameters:

dataset_uuids – list of UUIDs of the datasets which are going to be removed

remove_layers(indices: List[QModelIndex])[source]

Iterate over the given indices and delete the layer at each index, unless it is a system layer.

Before a layer can finally be deleted, it must be emptied: the layer must no longer have any ProductDatasets or other things associated with them, such as visual nodes. The layer must also be removed as an input layer from all derived layers that use the layer to be deleted. Finally, the corresponding Scene Graph node of the layer must be removed and the timeline must be updated.

Parameters:

indices – a list of QModelIndex indices which should be deleted and which should exist in the LayerModel

rowCount(self, parent: QModelIndex = QModelIndex()) int[source]
setData(self, index: QModelIndex, value: Any, role: int = Qt.ItemDataRole.EditRole) bool[source]
start_algebraic_composite_creation(layers=None)[source]

Starts creation of an algebraic composite recipe.

Parameters:

layers – The layers which will be used to create an algebraic composite. The layer at index 0 will be used for the x component, the layer at index 1 for the y component, and the layer at index 2 for the z component of the algebraic.

start_rgb_composite_creation(layers=None)[source]

Starts creation of an RGB composite recipe.

Parameters:

layers – The layers which will be used to create an RGB composite. The layer at index 0 will be used for the red component, the layer at index 1 for the green component, and the layer at index 2 for the blue component of the RGB.

supportedDropActions(self) Qt.DropActions[source]
toggle_layers_visibility(indexes: List[QModelIndex])[source]
update_recipe_layer_name(recipe: Recipe)[source]
update_recipe_layer_timeline(recipe: Recipe)[source]

Update the list of sched_times and associated data for which the recipe layer can present data.

A recipe layer aka derived layer has an entry for any given sched_time, if and only if all layers directly or indirectly referenced by its recipe have data for that sched_time. The method updates the timeline for (the layer of) the given recipe by calculating it as intersection of the timelines of all contributing layers. By comparing this common timeline with the current timeline of the recipe layer it has to be determined, for which sched_times derived datasets need to be removed, updated or added before the corresponding actions are performed.

MAINTENANCE: For now the removal of recipe layer datasets is the same regardless of whether the recipe layer has an algebraic or composite recipe. For the other two steps of the update process - updating and adding derived datasets - there is a different handling depending on the type of recipe.

If the given recipe (layer) can be used as input for other recipe layers (currently only possible for algebraics), the dependent recipe layers must also be updated; this is done by calling this method recursively with their recipes.

At the end of each update iteration, the information of the updated recipe layer is replaced. The clims of an algebraic layer are also set correctly if they still have an invalid clims value; if an algebraic layer becomes empty again, it gets an invalid clims value.

ATTENTION: There must be no cyclic dependency defined by recipes (e.g. an algebraic layer n which uses the algebraic layer m as input layer, which in turn - directly or indirectly - again uses the layer n as input layer), otherwise the depicted recursion will not terminate! This case is not caught!

Parameters:

recipe – Recipe of the layer whose timeline is to be updated
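The core of the timeline update, intersecting the contributing timelines and diffing against the current one, can be sketched as:

```python
from typing import Iterable, List, Tuple


def common_timeline(timelines: List[Iterable]) -> list:
    """sched_times for which every contributing layer has data."""
    common = set(timelines[0])
    for timeline in timelines[1:]:
        common &= set(timeline)
    return sorted(common)


def timeline_diff(current: Iterable, common: Iterable) -> Tuple[list, list, list]:
    """Which derived datasets need to be removed, updated, or added."""
    current_set, common_set = set(current), set(common)
    to_remove = sorted(current_set - common_set)
    to_update = sorted(current_set & common_set)
    to_add = sorted(common_set - current_set)
    return to_remove, to_update, to_add
```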

update_rgb_layer_color_limits(recipe: CompositeRecipe)[source]
update_rgb_layer_gamma(recipe: CompositeRecipe)[source]
update_user_colormap_for_layers(colormap)[source]

Forward changes to a custom colormap to layers that use it

This slot must be called, when a user-created color map has been edited. The changes must be propagated to the layers that use that color map so that they can update their scene graph nodes accordingly.

Parameters:

colormap – Name of the colormap which has an update

class uwsift.model.layer_model.ProductFamilyKeyMappingPolicy(model: LayerModel)[source]

Bases: object

get_existing_layer_for_dataset(info: frozendict)[source]

Returns a layer within an instance of LayerModel according to a match between the grouping_key calculated from the given dataset metadata information and the grouping_keys within LayerModel’s layers collection.

Parameters:

info – Dataset metadata information

Returns:

a tuple whose first element is the LayerItem whose grouping_key matches that of the passed dataset metadata information, if there is already one in the LayerModel, or None otherwise. The second element of the tuple is the grouping key generated by the policy. You must use that key when creating a new layer for the dataset of the given info to make the policy work.

static get_grouping_key(info)[source]
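The lookup can be pictured as follows; the choice of key fields is hypothetical, as the real grouping key is derived from the product family metadata:

```python
from typing import List, Optional, Tuple


def get_grouping_key(info: dict) -> tuple:
    """Hypothetical grouping key from dataset metadata (assumed fields)."""
    return (info.get("platform_name"), info.get("instrument"), info.get("name"))


def get_existing_layer_for_dataset(layers: List, info: dict) -> Tuple[Optional[object], tuple]:
    """Return (matching layer or None, grouping key to use for a new layer)."""
    key = get_grouping_key(info)
    for layer in layers:
        if layer.grouping_key == key:
            return layer, key
    return None, key
```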

uwsift.model.product_dataset module

class uwsift.model.product_dataset.ProductDataset(layer_uuid: UUID, info: frozendict, presentation: Presentation | None, input_datasets_uuids: List[UUID] | None = None)[source]

Bases: object

classmethod get_algebraic_dataset(layer_uuid: UUID, info: frozendict, presentation: Presentation | None, input_datasets_uuids: List[UUID])[source]
classmethod get_rgb_multichannel_product_dataset(layer_uuid: UUID, presentation: Presentation | None, input_datasets_uuids: List[UUID], kind: Kind, scheduled_time, input_datasets_infos: List[frozendict | None]) ProductDataset | None[source]
property kind
update_multichannel_dataset_info(input_datasets_infos)[source]
property uuid

uwsift.model.shapes module

shapes.py

PURPOSE Shape datasets which can be represented in the workspace as data content masks

REFERENCES

REQUIRES

author:

R.K.Garcia <rayg@ssec.wisc.edu>

copyright:

2014 by University of Wisconsin Regents, see AUTHORS for more details

license:

GPLv3, see LICENSE for more details

uwsift.model.shapes.content_within_shape(content: ndarray, trans: rasterio.Affine, shape: LinearRing)[source]
Parameters:
  • content – data being displayed on the screen

  • trans – affine transform between content array indices and screen coordinates

  • shape – LinearRing in screen coordinates (e.g. mercator meters)

Returns:

Tuple (masked_content, (y_index_offset, x_index_offset)), where masked_content is a minified masked array of the content inside the shape and the index offsets locate that sub-array within the original content array.
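The core of the operation is mapping screen coordinates back to array indices via the inverse of the affine transform and returning a minified sub-array plus its offsets. The sketch below is a simplification under stated assumptions: a plain bounding box stands in for the LinearRing (a real implementation would rasterize the ring, e.g. with rasterio.features), and the affine transform is reduced to an origin and pixel size.

```python
import numpy as np


def content_within_bbox(content, origin_xy, pixel_size, bbox):
    """Return the sub-array covered by `bbox` and its (row, col) offsets.

    bbox = (xmin, ymin, xmax, ymax) in screen coordinates.
    pixel_size = (dx, dy); dy is negative for north-up rasters.
    """
    x0, y0 = origin_xy
    dx, dy = pixel_size
    xmin, ymin, xmax, ymax = bbox
    # Invert the (simplified) affine transform: screen coords -> indices.
    col0 = int((xmin - x0) / dx)
    col1 = int(np.ceil((xmax - x0) / dx))
    row0 = int((ymax - y0) / dy)  # ymax maps to the smaller row index
    row1 = int(np.ceil((ymin - y0) / dy))
    sub = content[row0:row1, col0:col1]
    return sub, (row0, col0)


content = np.arange(16, dtype=float).reshape(4, 4)
# Origin at (0, 4), 1x1 pixels, north-up (dy = -1).
sub, (roff, coff) = content_within_bbox(
    content, (0.0, 4.0), (1.0, -1.0), (1.0, 1.0, 3.0, 3.0)
)
```

Returning the offsets alongside the minified array lets the caller map statistics computed on the sub-array back to positions in the full content array.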

uwsift.model.time_manager module

class uwsift.model.time_manager.TimeManager(animation_speed: float, matching_policy: ~typing.Callable = <function find_nearest_past>)[source]

Bases: QObject

Actions upon tick event:
  • Time Manager gets t_sim from t2t_translator

  • forwards it to Display Layers

  • Display Layers each give their timeline and t_sim to TimeMatcher

  • TimeMatcher returns t_matched for every non-driving layer timeline

  • each Display Layer requests the image corresponding to the matched timestamp from collection

  • Image is displayed
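The matching step in the tick sequence above can be sketched with a nearest-past policy on sorted timelines. Function names and the dictionary-based layer representation are illustrative, not the actual uwsift API; the real `find_nearest_past` lives in the time-matching machinery.

```python
import bisect


def find_nearest_past(timeline, t_sim):
    """Return the latest timestamp in sorted `timeline` that is <= t_sim,
    or None when every timestamp lies in the future."""
    i = bisect.bisect_right(timeline, t_sim)
    return timeline[i - 1] if i else None


def on_tick(t_sim, layer_timelines):
    """For every layer, pick the dataset timestamp matched to t_sim."""
    return {name: find_nearest_past(tl, t_sim) for name, tl in layer_timelines.items()}


matched = on_tick(
    12.0,
    {"driving": [0.0, 10.0, 20.0], "secondary": [5.0, 15.0]},
)
```

Each layer then requests the image for its matched timestamp, so layers with sparser timelines simply hold their most recent image until a newer one becomes valid.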

connect_to_model(layer_model: LayerModel)[source]
create_formatted_t_sim()[source]

Used for updating the animation label during animation.

get_current_timebase_current_dataset()[source]
get_current_timebase_current_dataset_uuid() UUID | None[source]
get_current_timebase_dataset_count()[source]
get_current_timebase_dataset_uuids() List[UUID][source]
get_current_timebase_datasets() List[ProductDataset][source]
get_current_timebase_timeline()[source]
get_current_timebase_timeline_index()[source]
jump(index)[source]
on_timebase_change(index)[source]

Slot to trigger a timebase change by looking up the data layer at the specified index, then calling the time transformer to execute the change of timebase.

Parameters:

index – DataLayer index obtained either by clicking an item in the ComboBox or by selecting a convenience function in the convenience function popup menu

property qml_backend: QmlBackend
step(backwards: bool = False)[source]

Advance in time, either forwards or backwards, by one time step.

Parameters:

backwards – If True, step backwards in time; otherwise step forwards.

sync_to_time_transformer()[source]
tick(event)[source]

Proxy function for TimeManager.step().

TimeManager cannot receive the animation timer's signal directly because that signal passes an event that step() cannot handle. Connect to this method instead to trigger step().

Parameters:

event – Event passed by AnimationController.animation_timer on expiry, simply dropped.
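The proxy pattern described above is small but worth making concrete: the timer delivers an argument the target method does not accept, so a thin wrapper drops it. This is a standalone sketch; the class name and counter are illustrative, not the real TimeManager.

```python
class TimeManagerSketch:
    """Hypothetical stand-in demonstrating the tick/step proxy."""

    def __init__(self):
        self.steps = 0

    def step(self, backwards: bool = False):
        self.steps += -1 if backwards else 1

    def tick(self, event):
        # `event` comes from the animation timer and is simply dropped;
        # only the act of being called matters.
        self.step()


tm = TimeManagerSketch()
tm.tick(object())
tm.tick(object())
```

Connecting the timer to `tick` rather than `step` keeps `step`'s signature clean for direct callers such as the step buttons.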

tick_qml_state(t_sim, timeline_idx)[source]
update_qml_layer_model()[source]

Slot connected to the didUpdateCollection signal; responsible for managing the data layer combo box contents.

update_qml_timeline(layer: LayerItem)[source]

Slot that updates and refreshes QML timeline state using a DataLayer.

DataLayer is either:
  1. a driving layer or some other form of high-priority data layer

  2. a ‘synthetic’ data layer, created only to reflect the best-fitting timeline/layer info for the current policy; this may be policy-dependent

# TODO(mk): the policy should not be responsible for UI; should another policy, or an object that ingests a policy, handle UI based on it?

Module contents

__init__.py

PURPOSE Model contains all the irreplaceable user input and state. It represents the metadata and planning of what’s to be shown on the screen and what the user can do with it. Model uses a Workspace to help it work with large quantities of data.

REFERENCES

REQUIRES

author:

R.K.Garcia <rayg@ssec.wisc.edu>

copyright:

2014 by University of Wisconsin Regents, see AUTHORS for more details

license:

GPLv3, see LICENSE for more details