
Login API#

hopsworks.login #

login(
    host: str | None = None,
    port: int = 443,
    project: str | None = None,
    api_key_value: str | None = None,
    api_key_file: str | None = None,
    hostname_verification: bool = False,
    trust_store_path: str | None = None,
    engine: Literal[
        "spark",
        "python",
        "training",
        "spark-no-metastore",
        "spark-delta",
    ]
    | None = None,
) -> project.Project

Connect to Serverless Hopsworks by calling the hopsworks.login() function with no arguments.

Connect to Serverless
import hopsworks

project = hopsworks.login()

Alternatively, connect to your own Hopsworks installation by specifying the host, port and API key.

Connect to your Hopsworks cluster
import hopsworks

project = hopsworks.login(
    host="my.hopsworks.server",
    port=8181,
    api_key_value="DKN8DndwaAjdf98FFNSxwdVKx",
)

In addition to the function arguments, hopsworks.login() also reads the following environment variables: HOPSWORKS_HOST, HOPSWORKS_PORT, HOPSWORKS_PROJECT, HOPSWORKS_API_KEY, HOPSWORKS_HOSTNAME_VERIFICATION, HOPSWORKS_TRUST_STORE_PATH, and HOPSWORKS_ENGINE.

Function arguments take precedence over the environment variables if both are set.
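
For example, the same connection could be configured entirely through environment variables (the values below are placeholders):

Connect using environment variables
import os

import hopsworks

# Placeholder values; replace them with your own cluster details.
os.environ["HOPSWORKS_HOST"] = "my.hopsworks.server"
os.environ["HOPSWORKS_PORT"] = "8181"
os.environ["HOPSWORKS_API_KEY"] = "DKN8DndwaAjdf98FFNSxwdVKx"

# No arguments needed; login() picks up the environment variables.
project = hopsworks.login()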

PARAMETER DESCRIPTION
host

The hostname of the Hopsworks instance.

TYPE: str | None DEFAULT: None

port

The port on which the Hopsworks instance can be reached.

TYPE: int DEFAULT: 443

project

Name of the project to access. When running inside a Hopsworks environment, the current project is always used. If not provided, you will be prompted to enter it.

TYPE: str | None DEFAULT: None

api_key_value

Value of the API key.

TYPE: str | None DEFAULT: None

api_key_file

Path to a file containing the API key.

TYPE: str | None DEFAULT: None

hostname_verification

Whether to verify the Hopsworks certificate.

TYPE: bool DEFAULT: False

trust_store_path

Path on the file system to the Hopsworks certificates.

TYPE: str | None DEFAULT: None

engine

Specifies the engine to use. The default value is None, which automatically selects the engine based on the environment:

  • spark: Used if Spark is available, such as in Hopsworks or Databricks environments.
  • python: Used in local Python environments or AWS SageMaker when Spark is not available.
  • training: Used when only feature store metadata is needed, such as for obtaining training dataset locations and label information during Hopsworks training experiments.
  • spark-no-metastore: Functions like spark but does not rely on the Hive metastore.
  • spark-delta: Minimizes dependencies further by avoiding both Hive metastore and HopsFS.

TYPE: Literal['spark', 'python', 'training', 'spark-no-metastore', 'spark-delta'] | None DEFAULT: None

RETURNS DESCRIPTION
project.Project

The Project object to perform operations on.

RAISES DESCRIPTION
hopsworks.client.exceptions.RestAPIError

If the backend encounters an error when handling the request.

hopsworks.client.exceptions.HopsworksSSLClientError

If an SSLError is raised by the underlying requests library.
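
The sketch below combines several of the parameters above; the host, project name, and file paths are placeholders and should be replaced with values for your own installation:

Connect with an API key file, certificate verification and an explicit engine
import hopsworks

project = hopsworks.login(
    host="my.hopsworks.server",
    port=8181,
    project="my_project",  # placeholder; skips the interactive project prompt
    api_key_file="/path/to/api_key.txt",  # placeholder path to a file holding the API key
    hostname_verification=True,  # verify the Hopsworks certificate
    trust_store_path="/path/to/certificates",  # placeholder path to the Hopsworks certificates
    engine="python",  # skip auto-detection and use the Python engine
)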

hopsworks.get_current_project #

get_current_project() -> project.Project

Get a reference to the currently logged-in project.

Example for getting the project reference
import hopsworks

hopsworks.login()

project = hopsworks.get_current_project()
RETURNS DESCRIPTION
project.Project

The Project object to perform operations on.

Specialized APIs#

Once you obtain a project using one of the above methods, you can use the specialized APIs available on the project object. For example: get_feature_store, get_model_registry, get_model_serving, etc.
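
For instance, assuming a successful login as above, these handles can be obtained directly from the returned Project object:

Access specialized APIs from the project object
import hopsworks

project = hopsworks.login()

# Each call returns a handle for the corresponding Hopsworks service.
feature_store = project.get_feature_store()
model_registry = project.get_model_registry()
model_serving = project.get_model_serving()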