DataWorks:Best practices for calling API operations to develop, commit, and run tasks

Last Updated: Mar 29, 2024

DataWorks provides various API operations. You can call the API operations to manage your business based on your requirements. This topic describes how to call DataWorks API operations to quickly develop, commit, and run tasks.

Background information

This topic describes the DataWorks API operations that can be called in the following business scenarios. Before you perform the steps that are described in this topic, we recommend that you understand the core capabilities and concepts related to the business scenarios.

  • Query and manage workspaces, workflows, node folders, and nodes, and commit and deploy nodes. DataStudio API operations, such as CreateBusiness and ListBusiness, are used.

  • Perform smoke testing and view run logs. Operation Center API operations, such as RunSmokeTest, are used.

The following sections describe the procedure and provide the core parts of the sample code.

  1. Backend code development

  2. Frontend code development

  3. Deploy and run the code on your on-premises machine

If you want to view or download the complete sample source code, see Reference: Download complete sample source code in this topic.

Backend code development

Step 1: Develop the ProjectService class to query workspaces

You need to develop the ProjectService class. The class defines the listProjects function, which calls the ListProjects operation to query workspaces. After you call the operation, the workspaces that can be used for frontend development are returned. The class obtains an API client from a shared DataWorksOpenApiClient helper; a minimal sketch of the helper is provided after the class code.

package com.aliyun.dataworks.services;


import com.aliyuncs.dataworks_public.model.v20200518.ListProjectsRequest;
import com.aliyuncs.dataworks_public.model.v20200518.ListProjectsResponse;
import com.aliyuncs.exceptions.ClientException;
import com.aliyuncs.exceptions.ServerException;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;


@Service
public class ProjectService {


    @Autowired
    private DataWorksOpenApiClient dataWorksOpenApiClient;


    /**
     * @param pageNumber
     * @param pageSize
     * @return
     */
    public ListProjectsResponse.PageResult listProjects(Integer pageNumber, Integer pageSize) {
        try {
            ListProjectsRequest listProjectsRequest = new ListProjectsRequest();
            listProjectsRequest.setPageNumber(pageNumber);
            listProjectsRequest.setPageSize(pageSize);
            ListProjectsResponse listProjectsResponse = dataWorksOpenApiClient.createClient().getAcsResponse(listProjectsRequest);
            System.out.println(listProjectsResponse.getRequestId());
            return listProjectsResponse.getPageResult();
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return null;
    }
}
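
The listProjects function, like all of the service functions in this topic, obtains an IAcsClient from a shared DataWorksOpenApiClient helper and then calls getAcsResponse to send the request. The helper is included in the complete sample source code. The following code provides a minimal sketch of the helper. It assumes that the region ID and the AccessKey pair are read from configuration properties; the property names are examples, and you must replace them with your own configuration keys.

package com.aliyun.dataworks.services;


import com.aliyuncs.DefaultAcsClient;
import com.aliyuncs.IAcsClient;
import com.aliyuncs.profile.DefaultProfile;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;


@Component
public class DataWorksOpenApiClient {


    // Example configuration keys. Replace them with the keys that you use, and do not hard-code
    // AccessKey credentials in source code.
    @Value("${dataworks.region-id}")
    private String regionId;


    @Value("${dataworks.access-key-id}")
    private String accessKeyId;


    @Value("${dataworks.access-key-secret}")
    private String accessKeySecret;


    /**
     * Creates the IAcsClient that the service classes use to send requests to the DataWorks API.
     */
    public IAcsClient createClient() {
        DefaultProfile profile = DefaultProfile.getProfile(regionId, accessKeyId, accessKeySecret);
        return new DefaultAcsClient(profile);
    }
}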

Step 2: Develop the BusinessService class to process workflows

You need to develop the BusinessService class. The class defines the following functions:

  • The createBusiness function that can be used to call the CreateBusiness operation to create a workflow.

  • The listBusiness function that can be used to call the ListBusiness operation to query workflows.

The functions are used during frontend development to create a sample workflow and query workflows. A minimal sketch of the CreateBusinessDTO request object that the createBusiness function accepts is provided after the class code.

Note

You can also develop the FolderService class to display a directory tree. The directory tree consists of workflows, node folders, and nodes. The following sample code provides an example of the core process. For the FolderService functions that are related to node folders, see the complete sample code that is provided on GitHub.

package com.aliyun.dataworks.services;


import com.aliyun.dataworks.dto.CreateBusinessDTO;
import com.aliyun.dataworks.dto.DeleteBusinessDTO;
import com.aliyun.dataworks.dto.ListBusinessesDTO;
import com.aliyun.dataworks.dto.UpdateBusinessDTO;
import com.aliyuncs.dataworks_public.model.v20200518.*;
import com.aliyuncs.exceptions.ClientException;
import com.aliyuncs.exceptions.ServerException;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.util.CollectionUtils;


import java.util.List;


/**
 * @author dataworks demo
 */
@Service
public class BusinessService {


    @Autowired
    private DataWorksOpenApiClient dataWorksOpenApiClient;


    /**
     * Create a business.
     *
     * @param createBusinessDTO
     */
    public Long createBusiness(CreateBusinessDTO createBusinessDTO) {
        try {
            CreateBusinessRequest createBusinessRequest = new CreateBusinessRequest();
            // The name of the workflow.
            createBusinessRequest.setBusinessName(createBusinessDTO.getBusinessName());
            createBusinessRequest.setDescription(createBusinessDTO.getDescription());
            createBusinessRequest.setOwner(createBusinessDTO.getOwner());
            createBusinessRequest.setProjectId(createBusinessDTO.getProjectId());
            // The module to which the workflow belongs. Valid values: NORMAL and MANUAL_BIZ.
            createBusinessRequest.setUseType(createBusinessDTO.getUseType());
            CreateBusinessResponse createBusinessResponse = dataWorksOpenApiClient.createClient().getAcsResponse(createBusinessRequest);
            System.out.println("create business requestId:" + createBusinessResponse.getRequestId());
            System.out.println("create business successful,the businessId:" + createBusinessResponse.getBusinessId());
            return createBusinessResponse.getBusinessId();
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return null;
    }


    /**
     * @param listBusinessesDTO
     * @return
     */
    public List<ListBusinessResponse.Data.BusinessItem> listBusiness(ListBusinessesDTO listBusinessesDTO) {
        try {
            ListBusinessRequest listBusinessRequest = new ListBusinessRequest();
            listBusinessRequest.setKeyword(listBusinessesDTO.getKeyword());
            listBusinessRequest.setPageNumber(listBusinessesDTO.getPageNumber() < 1 ? 1 : listBusinessesDTO.getPageNumber());
            listBusinessRequest.setPageSize(listBusinessesDTO.getPageSize() < 10 ? 10 : listBusinessesDTO.getPageSize());
            listBusinessRequest.setProjectId(listBusinessesDTO.getProjectId());
            ListBusinessResponse listBusinessResponse = dataWorksOpenApiClient.createClient().getAcsResponse(listBusinessRequest);
            System.out.println("list business requestId:" + listBusinessResponse.getRequestId());
            ListBusinessResponse.Data data = listBusinessResponse.getData();
            System.out.println("total count:" + data.getTotalCount());
            if (!CollectionUtils.isEmpty(data.getBusiness())) {
                for (ListBusinessResponse.Data.BusinessItem businessItem : data.getBusiness()) {
                    System.out.println(businessItem.getBusinessId() + "," + businessItem.getBusinessName() + "," + businessItem.getUseType());
                }
            }
            return data.getBusiness();
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return null;
    }


    /**
     * update a business
     * @param updateBusinessDTO
     * @return
     */
    public Boolean updateBusiness(UpdateBusinessDTO updateBusinessDTO) {
        try {
            UpdateBusinessRequest updateBusinessRequest = new UpdateBusinessRequest();
            updateBusinessRequest.setBusinessId(updateBusinessDTO.getBusinessId());
            updateBusinessRequest.setBusinessName(updateBusinessDTO.getBusinessName());
            updateBusinessRequest.setDescription(updateBusinessDTO.getDescription());
            updateBusinessRequest.setOwner(updateBusinessDTO.getOwner());
            updateBusinessRequest.setProjectId(updateBusinessDTO.getProjectId());
            UpdateBusinessResponse updateBusinessResponse = dataWorksOpenApiClient.createClient().getAcsResponse(updateBusinessRequest);
            System.out.println(updateBusinessResponse.getRequestId());
            System.out.println(updateBusinessResponse.getSuccess());
            return updateBusinessResponse.getSuccess();
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return false;
    }


    /**
     * delete a business
     * @param deleteBusinessDTO
     */
    public boolean deleteBusiness(DeleteBusinessDTO deleteBusinessDTO) {
        try {
            DeleteBusinessRequest deleteBusinessRequest = new DeleteBusinessRequest();
            deleteBusinessRequest.setBusinessId(deleteBusinessDTO.getBusinessId());
            deleteBusinessRequest.setProjectId(deleteBusinessDTO.getProjectId());
            DeleteBusinessResponse deleteBusinessResponse = dataWorksOpenApiClient.createClient().getAcsResponse(deleteBusinessRequest);
            System.out.println("delete business:" + deleteBusinessResponse.getRequestId());
            System.out.println("delete business" + deleteBusinessResponse.getSuccess());
            return deleteBusinessResponse.getSuccess();
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return false;
    }



}
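
The DTO classes that the service functions accept, such as CreateBusinessDTO and ListBusinessesDTO, are plain request objects that are defined in the complete sample source code. The following code provides a minimal sketch of CreateBusinessDTO. The fields are inferred from the getters that the createBusiness function calls, and the field types follow the setters of CreateBusinessRequest; the actual class in the sample code may differ.

package com.aliyun.dataworks.dto;


/**
 * The request object for creating a workflow. The fields mirror the getters that
 * BusinessService.createBusiness reads.
 */
public class CreateBusinessDTO {


    private String businessName;
    private String description;
    private String owner;
    private Long projectId;
    private String useType;


    public String getBusinessName() { return businessName; }
    public void setBusinessName(String businessName) { this.businessName = businessName; }


    public String getDescription() { return description; }
    public void setDescription(String description) { this.description = description; }


    public String getOwner() { return owner; }
    public void setOwner(String owner) { this.owner = owner; }


    public Long getProjectId() { return projectId; }
    public void setProjectId(Long projectId) { this.projectId = projectId; }


    public String getUseType() { return useType; }
    public void setUseType(String useType) { this.useType = useType; }
}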

Step 3: Develop the FileService class to process files

You need to develop the FileService class. The class defines the following functions that can be used to process files:

  • The listFiles function that can be used to call the ListFiles operation to query files.

  • The createFile function that can be used to call the CreateFile operation to create files.

  • The updateFile function that can be used to call the UpdateFile operation to update files.

  • The deployFile function that can be used to call the DeployFile operation to deploy files.

  • The runSmokeTest function that can be used to call the RunSmokeTest operation to perform smoke testing.

  • The getInstanceLog function that can be used to call the GetInstanceLog operation to query the logs of an instance.

The functions can be used to create a file, query files, save a file, and commit and run a file. A sketch that chains these functions to develop, commit, and run a node end to end is provided after the class code.

package com.aliyun.dataworks.services;


import com.aliyun.dataworks.dto.*;
import com.aliyuncs.dataworks_public.model.v20200518.*;
import com.aliyuncs.exceptions.ClientException;
import com.aliyuncs.exceptions.ServerException;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.util.CollectionUtils;


import java.util.List;


/**
 * the ide files manager service
 *
 * @author dataworks demo
 */
@Service
public class FileService {


    @Autowired
    private DataWorksOpenApiClient dataWorksOpenApiClient;


    public static final int CYCLE_NUM = 10;


    /**
     * Query files by page.
     * @param listFilesDTO
     * @return
     */
    public List<ListFilesResponse.Data.File> listFiles(ListFilesDTO listFilesDTO) {
        try {
            ListFilesRequest listFilesRequest = new ListFilesRequest();
            // File path: "Workflow/Name of the desired workflow/Name of the directory/Name of the latest folder"
            // Workflow/My first workflow/MaxCompute/ODS layer. Do not add "DataStudio" at the start of the path.
            listFilesRequest.setFileFolderPath(listFilesDTO.getFileFolderPath());
            // The code type of the files. You can specify multiple code types for files. Separate the code types with commas (,), such as 10,23.
            listFilesRequest.setFileTypes(listFilesDTO.getFileTypes());
            // The keyword in the file names. Fuzzy match is supported.
            listFilesRequest.setKeyword(listFilesDTO.getKeyword());
            // The ID of the node that is scheduled.
            listFilesRequest.setNodeId(listFilesDTO.getNodeId());
            // The owner of the files.
            listFilesRequest.setOwner(listFilesDTO.getOwner());
            // The number of the page to return.
            listFilesRequest.setPageNumber(listFilesDTO.getPageNumber() <= 0 ? 1 : listFilesDTO.getPageNumber());
            // The number of entries to return on each page.
            listFilesRequest.setPageSize(listFilesDTO.getPageSize() <= 10 ? 10 : listFilesDTO.getPageSize());
            // The ID of the DataWorks workspace.
            listFilesRequest.setProjectId(listFilesDTO.getProjectId());
            // The module to which the files belong.
            listFilesRequest.setUseType(listFilesDTO.getUseType());
            ListFilesResponse listFilesResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(listFilesRequest);
            ListFilesResponse.Data fileData = listFilesResponse.getData();
            if (fileData.getFiles() != null && !fileData.getFiles().isEmpty()) {
                for (ListFilesResponse.Data.File file : fileData.getFiles()) {
                    // The ID of the workflow.
                    System.out.println(file.getBusinessId());
                    // The ID of the file.
                    System.out.println(file.getFileId());
                    // The name of the file.
                    System.out.println(file.getFileName());
                    // The code type of the file, such as 10.
                    System.out.println(file.getFileType());
                    // The ID of the node.
                    System.out.println(file.getNodeId());
                    // The ID of the folder.
                    System.out.println(file.getFileFolderId());
                }
            }
            return fileData.getFiles();
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return null;
    }


    /**
     * Create a file.
     * @param createFileDTO
     */
    public Long createFile(CreateFileDTO createFileDTO) {
        try {
            CreateFileRequest createFileRequest = new CreateFileRequest();
            // The advanced configurations of the node.
            createFileRequest.setAdvancedSettings(createFileDTO.getAdvancedSettings());
            // Specifies whether to enable the automatic parsing feature for the file. This parameter is required.
            createFileRequest.setAutoParsing(createFileDTO.getAutoParsing());
            // The interval between automatic reruns after an error occurs. Unit: milliseconds. Maximum value: 1800000 (30 minutes).
            createFileRequest.setAutoRerunIntervalMillis(createFileDTO.getAutoRerunIntervalMillis());
            // The number of automatic retries.
            createFileRequest.setAutoRerunTimes(createFileDTO.getAutoRerunTimes());
            // The name of the connected data source that you want to use to run the node.  This parameter is required.
            createFileRequest.setConnectionName(createFileDTO.getConnectionName());
            // The code of the file. This parameter is required.
            createFileRequest.setContent(createFileDTO.getContent());
            // The CRON expression that represents the periodic scheduling policy of the node. This parameter is required.
            createFileRequest.setCronExpress(createFileDTO.getCronExpress());
            // The type of the scheduling cycle. This parameter is required.
            createFileRequest.setCycleType(createFileDTO.getCycleType());
            // The IDs of the nodes on which the current node depends. The instance that is generated for the node in the current cycle depends on the instances that are generated for the specified nodes in the previous cycle.
            createFileRequest.setDependentNodeIdList(createFileDTO.getDependentNodeIdList());
            // The type of the cross-cycle scheduling dependency for the current node. This parameter is required.
            createFileRequest.setDependentType(createFileDTO.getDependentType());
            // The end timestamp of automatic scheduling, in milliseconds. 
            createFileRequest.setEndEffectDate(createFileDTO.getEndEffectDate());
            // The description of the file.
            createFileRequest.setFileDescription(createFileDTO.getFileDescription());
            // The path of the file. This parameter is required.
            createFileRequest.setFileFolderPath(createFileDTO.getFileFolderPath());
            // The name of the file. This parameter is required.
            createFileRequest.setFileName(createFileDTO.getFileName());
            // The code type of the file. This parameter is required.
            createFileRequest.setFileType(createFileDTO.getFileType());
            // The output name of the file on which the current file depends. If you specify multiple output names, separate them with commas (,). This parameter is required.
            createFileRequest.setInputList(createFileDTO.getInputList());
            // The ID of the Alibaba Cloud account that is used by the file owner. If this parameter is not configured, the ID of the Alibaba Cloud account of the user who calls the operation is used.  This parameter is required.
            createFileRequest.setOwner(createFileDTO.getOwner());
            // The scheduling parameter. 
            createFileRequest.setParaValue(createFileDTO.getParaValue());
            // The ID of the workspace. This parameter is required.
            createFileRequest.setProjectId(createFileDTO.getProjectId());
            // The rerun type for the node.
            createFileRequest.setRerunMode(createFileDTO.getRerunMode());
            // The resource group that you want to use to run the node. This parameter is required.
            createFileRequest.setResourceGroupIdentifier(createFileDTO.getResourceGroupIdentifier());
            // The scheduling type of the node.
            createFileRequest.setSchedulerType(createFileDTO.getSchedulerType());
            // The start timestamp of automatic scheduling, in milliseconds.
            createFileRequest.setStartEffectDate(createFileDTO.getStartEffectDate());
            // Specifies whether to suspend the scheduling of the node.
            createFileRequest.setStop(createFileDTO.getStop());
            CreateFileResponse createFileResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(createFileRequest);
            // requestId
            System.out.println(createFileResponse.getRequestId());
            // fileId
            System.out.println(createFileResponse.getData());
            return createFileResponse.getData();
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return null;
    }


    /**
     * Update a file.  
     *
     * @param updateFileDTO
     */
    public boolean updateFile(UpdateFileDTO updateFileDTO) {
        try {
            UpdateFileRequest updateFileRequest = new UpdateFileRequest();
            // The advanced configurations of the node. For more information, see the related documentation.
            updateFileRequest.setAdvancedSettings(updateFileDTO.getAdvancedSettings());
            // Specifies whether to enable the automatic parsing feature for the file.
            updateFileRequest.setAutoParsing(updateFileDTO.getAutoParsing());
            // The interval between automatic reruns after an error occurs. Unit: milliseconds. Maximum value: 1800000 (30 minutes). 
            updateFileRequest.setAutoRerunIntervalMillis(updateFileDTO.getAutoRerunIntervalMillis());
            // The number of automatic reruns that are allowed after an error occurs.
            updateFileRequest.setAutoRerunTimes(updateFileDTO.getAutoRerunTimes());
            // The name of the data source that you want to use to run the node.
            updateFileRequest.setConnectionName(updateFileDTO.getConnectionName());
            // The code of the file.
            updateFileRequest.setContent(updateFileDTO.getContent());
            // The CRON expression that represents the periodic scheduling policy of the node. 
            updateFileRequest.setCronExpress(updateFileDTO.getCronExpress());
            // The type of the scheduling cycle. Valid values: NOT_DAY and DAY. The value NOT_DAY indicates that the node is scheduled to run by minute or hour. The value DAY indicates that the node is scheduled to run by day, week, or month.
            updateFileRequest.setCycleType(updateFileDTO.getCycleType());
            // The ID of the node on which the node that corresponds to the file depends when the DependentType parameter is set to USER_DEFINE. If you specify multiple IDs, separate them with commas (,).
            updateFileRequest.setDependentNodeIdList(updateFileDTO.getDependentNodeIdList());
            // The type of the cross-cycle scheduling dependency for the node that corresponds to the file.
            updateFileRequest.setDependentType(updateFileDTO.getDependentType());
            // The end timestamp of automatic scheduling, in milliseconds. 
            updateFileRequest.setEndEffectDate(updateFileDTO.getEndEffectDate());
            // The description of the file.
            updateFileRequest.setFileDescription(updateFileDTO.getFileDescription());
            // The path where the file resides.
            updateFileRequest.setFileFolderPath(updateFileDTO.getFileFolderPath());
            // The ID of the file.
            updateFileRequest.setFileId(updateFileDTO.getFileId());
            // The name of the file.
            updateFileRequest.setFileName(updateFileDTO.getFileName());
            // The output name of the file on which the current file depends. If you specify multiple output names, separate them with commas (,).
            updateFileRequest.setInputList(updateFileDTO.getInputList());
            // The output of the file.
            updateFileRequest.setOutputList(updateFileDTO.getOutputList());
            // The ID of the file owner.
            updateFileRequest.setOwner(updateFileDTO.getOwner());
            // The scheduling parameter.
            updateFileRequest.setParaValue(updateFileDTO.getParaValue());
            // The ID of the DataWorks workspace.
            updateFileRequest.setProjectId(updateFileDTO.getProjectId());
            // The rerun type for the node that corresponds to the file. Set the value to ALL_ALLOWED.
            updateFileRequest.setRerunMode(updateFileDTO.getRerunMode());
            // The identifier of the resource group that you want to use to run the node.
            updateFileRequest.setResourceGroupIdentifier(updateFileDTO.getResourceGroupIdentifier());
            // The scheduling type of the node. Set the value to NORMAL.
            updateFileRequest.setSchedulerType(updateFileDTO.getSchedulerType());
            // The start timestamp of automatic scheduling, in milliseconds.
            updateFileRequest.setStartEffectDate(updateFileDTO.getStartEffectDate());
            // Specifies whether to immediately run the node after the node is deployed.
            updateFileRequest.setStartImmediately(updateFileDTO.getStartImmediately());
            // Specifies whether to suspend the scheduling of the node.
            updateFileRequest.setStop(updateFileDTO.getStop());
            UpdateFileResponse updateFileResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(updateFileRequest);
            // requestId
            System.out.println(updateFileResponse.getRequestId());
            // The update result. Valid values: True and False.
            System.out.println(updateFileResponse.getSuccess());
            return updateFileResponse.getSuccess();
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return false;
    }


    /**
     * Delete a file.
     * @param deleteFileDTO
     * @return
     * @throws InterruptedException
     */
    public boolean deleteFile(DeleteFileDTO deleteFileDTO) throws InterruptedException {
        try {


            DeleteFileRequest deleteFileRequest = new DeleteFileRequest();
            deleteFileRequest.setFileId(deleteFileDTO.getFileId());
            deleteFileRequest.setProjectId(deleteFileDTO.getProjectId());
            DeleteFileResponse deleteFileResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(deleteFileRequest);
            System.out.println(deleteFileResponse.getRequestId());
            System.out.println(deleteFileResponse.getDeploymentId());


            GetDeploymentRequest getDeploymentRequest = new GetDeploymentRequest();
            getDeploymentRequest.setProjectId(deleteFileDTO.getProjectId());
            getDeploymentRequest.setDeploymentId(deleteFileResponse.getDeploymentId());
            for (int i = 0; i < CYCLE_NUM; i++) {
                GetDeploymentResponse getDeploymentResponse = dataWorksOpenApiClient.createClient()
                        .getAcsResponse(getDeploymentRequest);
                // The status of the deployment task. Valid values: 0, 1, and 2. The value 0 indicates that the deployment task is ready. The value 1 indicates that the deployment task is successful. The value 2 indicates that the deployment task failed. 
                Integer deleteStatus = getDeploymentResponse.getData().getDeployment().getStatus();
                // Poll the status of the deployment task.
                if (deleteStatus == 1) {
                    System.out.println("File deleted.");
                    break;
                } else {
                    System.out.println("Deleting file...");
                    Thread.sleep(1000L);
                }
            }


            GetProjectRequest getProjectRequest = new GetProjectRequest();
            getProjectRequest.setProjectId(deleteFileDTO.getProjectId());
            GetProjectResponse getProjectResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(getProjectRequest);
            // The type of the environment. A workspace in standard mode provides both the development and production environments, and a workspace in basic mode provides only the production environment.
            Boolean standardMode = getProjectResponse.getData().getEnvTypes().size() == 2;
            if (standardMode) {
                // If the workspace is in standard mode, you must deploy the operation of deleting the file to the production environment to make the operation take effect.
                DeployFileRequest deployFileRequest = new DeployFileRequest();
                deployFileRequest.setProjectId(deleteFileDTO.getProjectId());
                deployFileRequest.setFileId(deleteFileDTO.getFileId());
                DeployFileResponse deployFileResponse = dataWorksOpenApiClient.createClient()
                        .getAcsResponse(deployFileRequest);
                getDeploymentRequest.setDeploymentId(deployFileResponse.getData());
                for (int i = 0; i < CYCLE_NUM; i++) {
                    GetDeploymentResponse getDeploymentResponse = dataWorksOpenApiClient.createClient()
                            .getAcsResponse(getDeploymentRequest);
                    // The status of the deployment task. Valid values: 0, 1, and 2. The value 0 indicates that the deployment task is ready. The value 1 indicates that the deployment task is successful. The value 2 indicates that the deployment task failed. 
                    Integer deleteStatus = getDeploymentResponse.getData().getDeployment().getStatus();
                    // Poll the status of the deployment task.
                    if (deleteStatus == 1) {
                        System.out.println("File deleted.");
                        break;
                    } else {
                        System.out.println("Deleting file...");
                        Thread.sleep(1000L);
                    }
                }
            }
            return true;
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return false;
    }


    /**
     * Query the details of a file.
     * @param getFileDTO
     */
    public GetFileResponse.Data.File getFile(GetFileDTO getFileDTO) {
        try {
            GetFileRequest getFileRequest = new GetFileRequest();
            getFileRequest.setFileId(getFileDTO.getFileId());
            getFileRequest.setProjectId(getFileDTO.getProjectId());
            getFileRequest.setNodeId(getFileDTO.getNodeId());
            GetFileResponse getFileResponse = dataWorksOpenApiClient.createClient().getAcsResponse(getFileRequest);
            System.out.println(getFileResponse.getRequestId());
            GetFileResponse.Data.File file = getFileResponse.getData().getFile();
            System.out.println(file.getFileName());
            System.out.println(file.getFileType());
            System.out.println(file.getNodeId());
            System.out.println(file.getCreateUser());
            return file;
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return null;
    }


    /**
     * Commit and deploy a file.
     *
     * @param deployFileDTO
     * @return
     * @throws InterruptedException
     */
    public Boolean deployFile(DeployFileDTO deployFileDTO) throws InterruptedException {
        try {
            GetProjectRequest getProjectRequest = new GetProjectRequest();
            getProjectRequest.setProjectId(deployFileDTO.getProjectId());
            GetProjectResponse getProjectResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(getProjectRequest);
            // The type of the environment. A workspace in standard mode provides both the development and production environments, and a workspace in basic mode provides only the production environment.
            Boolean standardMode = getProjectResponse.getData().getEnvTypes().size() == 2;
            if (standardMode) {
                SubmitFileRequest submitFileRequest = new SubmitFileRequest();
                submitFileRequest.setFileId(deployFileDTO.getFileId());
                submitFileRequest.setProjectId(deployFileDTO.getProjectId());
                SubmitFileResponse submitFileResponse = dataWorksOpenApiClient.createClient()
                        .getAcsResponse(submitFileRequest);
                System.out.println("submit file requestId:" + submitFileResponse.getRequestId());
                System.out.println("submit file deploymentId:" + submitFileResponse.getData());
                for (int i = 0; i < CYCLE_NUM; i++) {
                    GetDeploymentRequest getDeploymentRequest = new GetDeploymentRequest();
                    getDeploymentRequest.setProjectId(deployFileDTO.getProjectId());
                    getDeploymentRequest.setDeploymentId(submitFileResponse.getData());
                    GetDeploymentResponse getDeploymentResponse = dataWorksOpenApiClient.createClient()
                            .getAcsResponse(getDeploymentRequest);
                    // The status of the deployment task. Valid values: 0, 1, and 2. The value 0 indicates that the deployment task is ready. The value 1 indicates that the deployment task is successful. The value 2 indicates that the deployment task failed.
                    Integer submitStatus = getDeploymentResponse.getData().getDeployment().getStatus();
                    // Poll the status of the deployment task.
                    if (submitStatus == 1) {
                        System.out.println("File submitted.");
                        break;
                    } else {
                        System.out.println("Submitting file...");
                        Thread.sleep(1000L);
                    }
                    }
                }
            }
            DeployFileRequest deployFileRequest = new DeployFileRequest();
            deployFileRequest.setFileId(deployFileDTO.getFileId());
            deployFileRequest.setProjectId(deployFileDTO.getProjectId());
            DeployFileResponse deployFileResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(deployFileRequest);
            System.out.println("deploy file requestId:" + deployFileResponse.getRequestId());
            System.out.println("deploy file deploymentId:" + deployFileResponse.getData());
            for (int i = 0; i < CYCLE_NUM; i++) {
                GetDeploymentRequest getDeploymentRequest = new GetDeploymentRequest();
                getDeploymentRequest.setProjectId(deployFileDTO.getProjectId());
                getDeploymentRequest.setDeploymentId(deployFileResponse.getData());
                GetDeploymentResponse getDeploymentResponse = dataWorksOpenApiClient.createClient()
                        .getAcsResponse(getDeploymentRequest);
                // The status of the deployment task. Valid values: 0, 1, and 2. The value 0 indicates that the deployment task is ready. The value 1 indicates that the deployment task is successful. The value 2 indicates that the deployment task failed.
                Integer deployStatus = getDeploymentResponse.getData().getDeployment().getStatus();
                // Poll the status of the deployment task.
                if (deployStatus == 1) {
                    System.out.println("File deployed.");
                    break;
                } else {
                    System.out.println("Deploying file...");
                    Thread.sleep(1000L);
                }
                }
            }
            return true;
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return false;


    }


    /**
     * Run a smoke test for the node and query the instances that are generated.
     *
     * @param runSmokeTestDTO
     */
    public List runSmokeTest(RunSmokeTestDTO runSmokeTestDTO) {
        try {
            RunSmokeTestRequest runSmokeTestRequest = new RunSmokeTestRequest();
            runSmokeTestRequest.setBizdate(runSmokeTestDTO.getBizdate());
            runSmokeTestRequest.setNodeId(runSmokeTestDTO.getNodeId());
            runSmokeTestRequest.setNodeParams(runSmokeTestDTO.getNodeParams());
            runSmokeTestRequest.setName(runSmokeTestDTO.getName());
            runSmokeTestRequest.setProjectEnv(runSmokeTestDTO.getProjectEnv());
            RunSmokeTestResponse runSmokeTestResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(runSmokeTestRequest);
            System.out.println(runSmokeTestResponse.getRequestId());
            // The ID of the DAG that is generated for the smoke test.
            System.out.println(runSmokeTestResponse.getData());


            ListInstancesRequest listInstancesRequest = new ListInstancesRequest();
            listInstancesRequest.setDagId(runSmokeTestResponse.getData());
            listInstancesRequest.setProjectId(runSmokeTestDTO.getProjectId());
            listInstancesRequest.setProjectEnv(runSmokeTestDTO.getProjectEnv());
            listInstancesRequest.setNodeId(runSmokeTestDTO.getNodeId());
            listInstancesRequest.setPageNumber(1);
            listInstancesRequest.setPageSize(10);
            ListInstancesResponse listInstancesResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(listInstancesRequest);
            System.out.println(listInstancesResponse.getRequestId());
            List instances = listInstancesResponse.getData().getInstances();
            if (CollectionUtils.isEmpty(instances)) {
                return null;
            }
            return instances;
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return null;
    }


    /**
     * Query the details and run logs of an instance.
     *
     * @param instanceId
     * @param projectEnv
     */
    public InstanceDetail getInstanceLog(Long instanceId, String projectEnv) {
        try {
            GetInstanceLogRequest getInstanceLogRequest = new GetInstanceLogRequest();
            getInstanceLogRequest.setInstanceId(instanceId);
            getInstanceLogRequest.setProjectEnv(projectEnv);
            GetInstanceLogResponse getInstanceLogResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(getInstanceLogRequest);
            System.out.println(getInstanceLogResponse.getRequestId());


            GetInstanceRequest getInstanceRequest = new GetInstanceRequest();
            getInstanceRequest.setInstanceId(instanceId);
            getInstanceRequest.setProjectEnv(projectEnv);
            GetInstanceResponse getInstanceResponse = dataWorksOpenApiClient.createClient()
                    .getAcsResponse(getInstanceRequest);
            System.out.println(getInstanceResponse.getRequestId());
            System.out.println(getInstanceResponse.getData());


            InstanceDetail instanceDetail = new InstanceDetail();
            instanceDetail.setInstance(getInstanceResponse.getData());
            instanceDetail.setInstanceLog(getInstanceLogResponse.getData());
            return instanceDetail;
        } catch (ServerException e) {
            e.printStackTrace();
        } catch (ClientException e) {
            e.printStackTrace();
            // The request ID.
            System.out.println(e.getRequestId());
            // The error code.
            System.out.println(e.getErrCode());
            // The error message.
            System.out.println(e.getErrMsg());
        }
        return null;
    }
}
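
The following code provides a sketch of how the preceding FileService functions can be chained to develop, commit, and run a node end to end. All IDs and literal values are placeholders, and the DTO setters are assumed to mirror the getters that FileService reads; replace the values with your own before you run the code.

package com.aliyun.dataworks.demo;


import com.aliyun.dataworks.dto.CreateFileDTO;
import com.aliyun.dataworks.dto.DeployFileDTO;
import com.aliyun.dataworks.dto.GetFileDTO;
import com.aliyun.dataworks.dto.RunSmokeTestDTO;
import com.aliyun.dataworks.services.FileService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;


@Component
public class TaskFlowExample {


    @Autowired
    private FileService fileService;


    public void developCommitAndRun() throws InterruptedException {
        Long projectId = 12345L;                            // Placeholder workspace ID.


        // 1. Create the file that defines the node.
        CreateFileDTO createFileDTO = new CreateFileDTO();
        createFileDTO.setProjectId(projectId);
        createFileDTO.setFileFolderPath("Workflow/My first workflow/MaxCompute/ODS layer");
        createFileDTO.setFileName("ods_demo_node");
        createFileDTO.setFileType(10);                      // For example, 10 indicates an ODPS SQL node.
        createFileDTO.setContent("SELECT 1;");
        createFileDTO.setAutoParsing(true);
        Long fileId = fileService.createFile(createFileDTO);


        // 2. Commit and deploy the file. This generates or updates the node in the scheduling system.
        DeployFileDTO deployFileDTO = new DeployFileDTO();
        deployFileDTO.setProjectId(projectId);
        deployFileDTO.setFileId(fileId);
        fileService.deployFile(deployFileDTO);


        // 3. Query the file to obtain the ID of the node that is generated for it.
        GetFileDTO getFileDTO = new GetFileDTO();
        getFileDTO.setProjectId(projectId);
        getFileDTO.setFileId(fileId);
        Long nodeId = fileService.getFile(getFileDTO).getNodeId();


        // 4. Run a smoke test for the node and query the instances that are generated.
        RunSmokeTestDTO runSmokeTestDTO = new RunSmokeTestDTO();
        runSmokeTestDTO.setProjectId(projectId);
        runSmokeTestDTO.setProjectEnv("PROD");              // PROD for the production environment, DEV for the development environment.
        runSmokeTestDTO.setNodeId(nodeId);
        runSmokeTestDTO.setName("smoke_test_ods_demo_node");
        runSmokeTestDTO.setBizdate("2024-03-28 00:00:00");  // Placeholder business date.
        fileService.runSmokeTest(runSmokeTestDTO);
    }
}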

Step 4: Develop an IDE controller

You need to define an IDE controller that exposes the preceding service functions as HTTP routes that the frontend code calls. A minimal sketch of the Spring Boot entry point that serves these routes is provided after the class code.

package com.aliyun.dataworks.demo;


import com.aliyun.dataworks.dto.*;
import com.aliyun.dataworks.services.BusinessService;
import com.aliyun.dataworks.services.FileService;
import com.aliyun.dataworks.services.FolderService;
import com.aliyun.dataworks.services.ProjectService;
import com.aliyuncs.dataworks_public.model.v20200518.*;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.*;


import java.util.List;


/**
 * @author dataworks demo
 */
@RestController
@RequestMapping("/ide")
public class IdeController {


    @Autowired
    private FileService fileService;


    @Autowired
    private FolderService folderService;


    @Autowired
    private BusinessService businessService;


    @Autowired
    private ProjectService projectService;


    /**
     * List files.
     *
     * @param listFilesDTO
     * @return ListFilesResponse.Data.File
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @GetMapping("/listFiles")
    public List listFiles(ListFilesDTO listFilesDTO) {
        return fileService.listFiles(listFilesDTO);
    }


    /**
     * List folders.
     *
     * @param listFoldersDTO
     * @return ListFoldersResponse.Data.FoldersItem
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @GetMapping("/listFolders")
    public List listFolders(ListFoldersDTO listFoldersDTO) {
        return folderService.listFolders(listFoldersDTO);
    }


    /**
     * Create a folder.
     *
     * @param createFolderDTO
     * @return boolean
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @PostMapping("/createFolder")
    public boolean createFolder(@RequestBody CreateFolderDTO createFolderDTO) {
        return folderService.createFolder(createFolderDTO);
    }


    /**
     * Update a folder.
     *
     * @param updateFolderDTO
     * @return boolean
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @PostMapping("/updateFolder")
    public boolean updateFolder(@RequestBody UpdateFolderDTO updateFolderDTO) {
        return folderService.updateFolder(updateFolderDTO);
    }


    /**
     * Query a file.
     *
     * @param getFileDTO
     * @return GetFileResponse.Data.File
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @GetMapping("/getFile")
    public GetFileResponse.Data.File getFile(GetFileDTO getFileDTO) {
        return fileService.getFile(getFileDTO);
    }


    /**
     * Create a file.
     *
     * @param createFileDTO
     * @return fileId
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @PostMapping("/createFile")
    public Long createFile(@RequestBody CreateFileDTO createFileDTO) {
        return fileService.createFile(createFileDTO);
    }


    /**
     * Update a file.
     *
     * @param updateFileDTO
     * @return boolean
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @PostMapping("/updateFile")
    public boolean updateFile(@RequestBody UpdateFileDTO updateFileDTO) {
        return fileService.updateFile(updateFileDTO);
    }


    /**
     * Commit and deploy a file.
     *
     * @param deployFileDTO
     * @return boolean
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @PostMapping("/deployFile")
    public boolean deployFile(@RequestBody DeployFileDTO deployFileDTO) {
        try {
            return fileService.deployFile(deployFileDTO);
        } catch (Exception e) {
            System.out.println(e);
        }
        return false;
    }


    /**
     * Delete a file.
     *
     * @param deleteFileDTO
     * @return
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @DeleteMapping("/deleteFile")
    public boolean deleteFile(DeleteFileDTO deleteFileDTO) {
        try {
            return fileService.deleteFile(deleteFileDTO);
        } catch (Exception e) {
            System.out.println(e);
        }
        return false;
    }


    /**
     * Delete a folder.
     *
     * @param deleteFolderDTO
     * @return
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @DeleteMapping("/deleteFolder")
    public boolean deleteFolder(DeleteFolderDTO deleteFolderDTO) {
        return folderService.deleteFolder(deleteFolderDTO);
    }


    /**
     * list businesses
     *
     * @param listBusinessesDTO
     * @return
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @GetMapping("/listBusinesses")
    public List listBusiness(ListBusinessesDTO listBusinessesDTO) {
        return businessService.listBusiness(listBusinessesDTO);
    }


    /**
     * create a business
     *
     * @param createBusinessDTO
     * @return
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @PostMapping("/createBusiness")
    public Long createBusiness(@RequestBody CreateBusinessDTO createBusinessDTO) {
        return businessService.createBusiness(createBusinessDTO);
    }


    /**
     * update a business
     *
     * @param updateBusinessDTO
     * @return
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @PostMapping("/updateBusiness")
    public boolean updateBusiness(@RequestBody UpdateBusinessDTO updateBusinessDTO) {
        return businessService.updateBusiness(updateBusinessDTO);
    }


    /**
     * delete a business
     *
     * @param deleteBusinessDTO
     * @return
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @PostMapping("/deleteBusiness")
    public boolean deleteBusiness(@RequestBody DeleteBusinessDTO deleteBusinessDTO) {
        return businessService.deleteBusiness(deleteBusinessDTO);
    }



    /**
     * @param pageNumber
     * @param pageSize
     * @return
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @GetMapping("/listProjects")
    public ListProjectsResponse.PageResult listProjects(Integer pageNumber, Integer pageSize) {
        return projectService.listProjects(pageNumber, pageSize);
    }


    /**
     * @param runSmokeTestDTO
     * @return
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @PutMapping("/runSmokeTest")
    public List runSmokeTest(@RequestBody RunSmokeTestDTO runSmokeTestDTO) {
        return fileService.runSmokeTest(runSmokeTestDTO);
    }


    /**
     * @param instanceId
     * @param projectEnv
     * @return
     */
    @CrossOrigin(origins = "http://localhost:8080")
    @GetMapping("/getLog")
    public InstanceDetail getLog(@RequestParam Long instanceId, @RequestParam String projectEnv) {
        return fileService.getInstanceLog(instanceId, projectEnv);
    }


}
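
To run the backend locally, the controller and service beans must be started by a Spring Boot application. The complete sample source code contains such an entry point; the following is a minimal sketch, and the class name and the scanBasePackages value are illustrative.

package com.aliyun.dataworks.demo;


import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;


/**
 * Starts the embedded web server so that the /ide routes that are defined in IdeController
 * are available to the frontend. scanBasePackages covers both the controller package
 * (com.aliyun.dataworks.demo) and the service package (com.aliyun.dataworks.services).
 */
@SpringBootApplication(scanBasePackages = "com.aliyun.dataworks")
public class DataWorksDemoApplication {


    public static void main(String[] args) {
        SpringApplication.run(DataWorksDemoApplication.class, args);
    }
}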

Frontend code development

  1. Initialize the editor, directory tree, and terminal.

    Sample code:

    const App: FunctionComponent = () => {
      const editorRef = useRef(null);
      const terminalRef = useRef(null);
      const [terminal, setTerminal] = useState();
      const [editor, setEditor] = useState();
      const [expandedKeys, setExpandedKeys] = useState();
      const [workspace, setWorkspace] = useState();
      const [workspaces, setWorkspaces] = useState<{ label: string, value: number }[]>([]);
      const [dataSource, setDataSource] = useState();
      const [selectedFile, setSelectedFile] = useState();
      const [loading, setLoading] = useState(false);
      // Create an editor instance.
      useEffect(() => {
        if (editorRef.current) {
          const nextEditor = monaco.editor.create(editorRef.current, editorOptions);
          setEditor(nextEditor);
          return () => { nextEditor.dispose(); };
        }
      }, [editorRef.current]);
      // Add a keyboard input event that is used to save the file.
      useEffect(() => {
        editor?.addCommand(monaco.KeyMod.CtrlCmd | monaco.KeyCode.KeyS, () => {
          if (!workspace) {
            showTips('Please select workspace first');
            return;
          }
          saveFile(workspace, editor, selectedFile);
        });
      }, [editor, workspace, selectedFile]);
      // Create a terminal instance.
      useEffect(() => {
        if (terminalRef.current) {
          const term: NextTerminal = new Terminal(terminalOptions) as any;
          term.pointer = -1;
          term.stack = [];
          setTerminal(term);
          const fitAddon = new FitAddon();
          term.loadAddon(fitAddon);
          term.open(terminalRef.current);
          fitAddon.fit();
          term.write('$ ');
          return () => { term.dispose(); };
        }
      }, [terminalRef.current]);
      // Register a terminal input event.
      useEffect(() => {
        const event = terminal?.onKey(e => onTerminalKeyChange(e, terminal, dataSource, workspace));
        return () => {
          event?.dispose();
        };
      }, [terminal, dataSource, workspace]);
      // Query data sources in the directory tree.
      useEffect(() => {
        workspace && (async () => {
          setLoading(true);
          const nextDataSource = await getTreeDataSource(workspace, workspaces);
          const defaultKey = nextDataSource?.[0]?.key;
          defaultKey && setExpandedKeys([defaultKey]);
          setDataSource(nextDataSource);
          setLoading(false);
        })();
      }, [workspace]);
      // When you click a file in the directory tree, you can query the details and code of the file.
      useEffect(() => {
        workspace && selectedFile && (async () => {
          setLoading(true);
          const file = await getFileInfo(workspace, selectedFile);
          editor?.setValue(file.content);
          editor?.getAction('editor.action.formatDocument').run();
          setLoading(false);
        })();
      }, [selectedFile]);
      // Query workspaces.
      useEffect(() => {
        (async () => {
          const list = await getWorkspaceList();
          setWorkspaces(list);
        })();
      }, []);
      const onExpand = useCallback((keys: number[]) => { setExpandedKeys(keys); }, []);
      const onWorkspaceChange = useCallback((value: number) => { setWorkspace(value) }, []);
      const onTreeNodeSelect = useCallback((key: number[]) => { key[0] && setSelectedFile(key[0]) }, []);
      // The JSX returned by the component is omitted here. It renders the "Workspace:" selector,
      // the directory tree, the Monaco editor container (editorRef), and the terminal container
      // (terminalRef). See the complete sample source code for the full markup.
    };

    export default App;