Get started with NebulaGraph Cloud

This guide covers how to quickly get started with NebulaGraph Cloud.

Note

If you are prompted to log in to the GQL documentation center during any of the following operations, enter your NebulaGraph Cloud account name and the initial password Nebula.123. You will be prompted to change the initial password when you log in for the first time.

Prerequisites

A database instance is created. For more information, see Create a database instance.

Step 1. Create graphs

Before you import data, you must create a graph to store the data to be imported. In NebulaGraph Cloud, graphs are created based on graph types, which define the data structure of the graphs. To ensure that your data is imported correctly, the graph type must match the data structure of the source data.

We provide a demo dataset so that you can quickly try out NebulaGraph Cloud. The demo dataset contains a graph type and a graph created from the demo data. To use it, skip this step and go to Step 2. To create your own graph type and graph, follow the steps below.

  1. Click the name of the database that you created to view the database details.
  2. On the left-side navigation pane of the database details page, click Graph under Data.
  3. In the upper-left corner of the Graph page, follow the instructions in Create a graph type and Create a graph to create a graph type and a graph.
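
A graph type and graph can also be created with GQL statements in the console. The sketch below follows ISO-GQL-style syntax; the type name `social_type`, the graph name `social`, and all labels and properties are illustrative, and the exact DDL syntax that NebulaGraph Cloud accepts is documented in Create a graph type:

```gql
// Define a graph type with one node label and one edge label
// (names and syntax are illustrative).
CREATE GRAPH TYPE social_type AS {
  NODE person ({name STRING, age INT}),
  EDGE follows ()-[{since INT}]->()
};

// Create a graph based on that graph type.
CREATE GRAPH social TYPED social_type;
```

The key point is the relationship the UI enforces: every graph is created from a graph type, and the type fixes the labels and properties your imported data must match.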

Step 2. Import data

  1. On the left-side navigation pane of the database details page, click Graph under Data.
  2. You can import the demo dataset or your custom data into the database. Use either of the following methods based on your requirements.

    • Use the demo dataset

      On the Graph page, click Use Demo Dataset to load the demo dataset in NebulaGraph Cloud. For more information, see Demo dataset.

    • Import custom data

      1. Go to the database details page, and click Import on the left-side navigation pane.

      2. On the Import page, click Import Data and follow the instructions in Create import task to import your data.
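
Besides running an import task, small amounts of data can be written directly from the query console. The statements below use ISO-GQL-style `INSERT` and `MATCH` syntax; the labels and properties are illustrative and must match the graph type of the target graph:

```gql
// Insert two nodes (labels and properties are illustrative).
INSERT (:person {name: 'Alice', age: 30}),
       (:person {name: 'Bob', age: 28});

// Match the two nodes and insert an edge between them.
MATCH (a:person {name: 'Alice'}), (b:person {name: 'Bob'})
INSERT (a)-[:follows {since: 2024}]->(b);
```

For bulk data, prefer the import task; console inserts are best suited to quick experiments and fixes.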

Step 3. Run queries and explore data

On the left-side navigation pane of the database details page:

  • Click Explore to explore data on canvas. For detailed instructions, see Explore graph.
  • Click Query to perform GQL queries in the console.

    For how to write GQL queries, see GQL guide.

    For how to navigate around the console, see Data query.
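
As a starting point in the console, a read query in GQL is a pattern match like the sketch below (the label, edge, and property names are illustrative and must exist in your graph):

```gql
// Find up to 10 follow relationships where the follower is older than 25.
MATCH (p:person)-[:follows]->(q:person)
WHERE p.age > 25
RETURN p.name, q.name
LIMIT 10;
```

Queries run against the graph selected in the console, so make sure the graph you created or the demo graph is selected before executing.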

Step 4. Monitor database information and metrics

On the left-side navigation pane of the database details page:

  • To view the basic information and core metrics of the database, click Overview under Basic.

    • On the Basic pane, you can view the basic information about the database, such as the cloud provider, status, and capacity.
    • On the Core Metrics pane, you can select the time range, start time, and end time to view the core database metrics.
  • To view all the database metrics, click Metrics under Monitoring. Then, select the time range, start time, end time, and refresh rate to view the metrics, including:

    • The metrics of NebulaGraph service on the Service pane:

      • Query Per Second (QPS): The number of queries executed per second.
      • Total Sessions: The total number of sessions within a specified period.
      • Query Error Rate: The percentage of failed queries.
      • Query Latency (P95): The 95th percentile latency of all queries sent by the Graph service to the Storage service in the last 5 seconds.
      • DML Latency (P95): The 95th percentile latency of DML queries sent by the Graph service to the Storage service in the last 5 seconds.
      • DQL Latency (P95): The 95th percentile latency of DQL queries sent by the Graph service to the Storage service in the last 5 seconds.
    • The metrics of storage nodes and query nodes on the Infrastructure pane:

      • CPU Utilization: The percentage of CPU used by the database.
      • Memory Utilization: The percentage of memory used by the database.
      • Storage Used: The amount of storage used by the database.
      • Write IOPS: The write input/output operations per second (IOPS) for each Amazon Elastic Block Store (Amazon EBS) volume.
      • Read IOPS: The read input/output operations per second (IOPS) for each Amazon Elastic Block Store (Amazon EBS) volume.
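
For reference, the P95 latency metrics above follow the usual percentile definition: 95% of the measured latencies fall at or below the reported value. Under the common nearest-rank convention (the exact method NebulaGraph Cloud uses is not specified here), for the $n$ latencies in the window sorted in ascending order:

```latex
l_{(1)} \le l_{(2)} \le \dots \le l_{(n)}, \qquad
\mathrm{P95} = l_{(\lceil 0.95\,n \rceil)}
```

For example, with $n = 200$ queries in the window, P95 is the 190th smallest latency; a P95 of 20 ms means 95% of those queries completed in 20 ms or less.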