In this article:

    Load and conditions for comfortable operation

    To work comfortably with Big Data clusters, we recommend the following architecture and cluster load planning strategy (a sizing sketch in Python follows the table):

    Parameter              | Airflow                                     | Spark                                       | Dataflow
    -----------------------|---------------------------------------------|---------------------------------------------|--------------------------------------------
    Number of CPUs         | 5-10% more than the calculated peak load    | 5-10% more than the calculated peak load    | 5-10% more than the calculated peak load
    Amount of RAM          | 20% more than the calculated peak volume    | 20% more than the calculated peak volume    | 20% more than the calculated peak volume
    Hard disk space        | 10% more than the planned storage capacity  | 10% more than the planned storage capacity  | 10% more than the planned storage capacity
    Hard disk type         | SSD, High-IOPS SSD                          | SSD, High-IOPS SSD                          | SSD, High-IOPS SSD
    Head Nodes             | 1                                           | 1                                           | 1
    Number of Worker Nodes | 1                                           | 1                                           | 1
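
    The margins in the table translate directly into arithmetic on measured peak usage. The Python sketch below is a minimal illustration, not part of the platform's tooling: the function name and the example peak figures are assumptions, while the headroom factors come from the table above (the CPU margin uses the upper bound of the 5-10% range).

    def recommended_resources(peak_cpu_cores, peak_ram_gb, planned_storage_gb):
        """Apply the headroom margins from the table to measured peak values."""
        return {
            # CPU: 5-10% above the calculated peak load (upper bound used here)
            "cpu_cores": peak_cpu_cores * 1.10,
            # RAM: 20% above the calculated peak volume
            "ram_gb": peak_ram_gb * 1.20,
            # Hard disk: 10% above the planned storage capacity
            "disk_gb": planned_storage_gb * 1.10,
        }

    if __name__ == "__main__":
        # Hypothetical peak figures, used only for illustration
        sizing = recommended_resources(peak_cpu_cores=16,
                                       peak_ram_gb=64,
                                       planned_storage_gb=500)
        for resource, value in sizing.items():
            print(f"{resource}: {value:.1f}")

    For the hypothetical figures above this yields roughly 17.6 CPU cores, 76.8 GB of RAM, and 550 GB of disk; rounding up to the nearest available configuration keeps the cluster within the recommended margins.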
