          Object Storage

          Object Management

          Object Uploading

          The Simplest Uploading

          • Basic procedure

            1. Create a BosClient. 2. Call a putObject() family method; an object can be uploaded in four forms: string, buffer, file, or blob.

          • Sample Code

              function done(response) {
                  // Uploaded successfully 
              }
              function fail(error) {
                  // Failed to upload 
              }
              
              // Upload in string form 
              client.putObjectFromString(bucket, object, 'hello world')
                .then(done)
                .catch(fail);
              
              // Upload in buffer form
              var buffer = Buffer.from('hello world');
              client.putObject(bucket, object, buffer)
                .then(done)
                .catch(fail);
              
              // Upload in file form (only supported in the Node.js environment)
              client.putObjectFromFile(bucket, object, <path-to-file>)
                .then(done)
                .catch(fail);
              
              // Upload in blob form (only supported in the browser environment)
              client.putObjectFromBlob(bucket, object, <blob object>)
                .then(done)
                .catch(fail);

            Note: When uploading an object as a file, objects no larger than 5 GB can be uploaded with a single putObject request. After the putObject request is processed successfully, BOS returns the object's ETag in the response header as a file identifier.

          Set Http Header of Object and Custom Metadata

          In essence, the SDK calls the backend HTTP interface, so the BOS service allows users to customize the HTTP headers of a request. Users can also attach custom Meta information to the object being uploaded. Taking the putObjectFromFile() function as an example, the following code can be used:

          • Sample Code

              let options = {
                'content-length': <file.size>,      // Add HTTP header
                'content-type': 'application/json', // Add HTTP header

                'x-bce-meta-foo1': 'bar1', // Add custom meta information
                'x-bce-meta-foo2': 'bar2', // Add custom meta information
                'x-bce-meta-foo3': 'bar3'  // Add custom meta information
              };
              client.putObjectFromFile(bucket, object, <path-to-file>, options)
                .then(done)
                .catch(fail);

            Note: The key of custom Meta information must start with x-bce-meta-.
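Because keys are only treated as user metadata when they carry the x-bce-meta- prefix, a small hypothetical helper (not part of the SDK) can normalize keys before building the options object:

```javascript
// Hypothetical helper: ensure every custom metadata key starts with "x-bce-meta-".
function withMetaPrefix(meta) {
    const out = {};
    for (const key of Object.keys(meta)) {
        const lower = key.toLowerCase();
        out[lower.startsWith('x-bce-meta-') ? lower : 'x-bce-meta-' + lower] = meta[key];
    }
    return out;
}

// withMetaPrefix({ foo1: 'bar1' }) yields { 'x-bce-meta-foo1': 'bar1' }
```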

          View the Objects in a Bucket

          View the list of objects in a bucket.

          • Basic procedure

            1. Create a BosClient. 2. Call the listObjects() method.

          • Sample Code

              client.listObjects(<bucketName>)
                .then(function (response) {
                    var contents = response.body.contents;
                    for (var i = 0, l = contents.length; i < l; i++) {
                        console.log(contents[i].key);
                    }
                })
                .catch(function (error) {
                    // Failed to inquire 
                });

            Note:

            • By default, at most 1,000 objects are returned; if the bucket holds more than 1,000 objects, the isTruncated value is true and nextMarker is returned as the starting point for the next read.
            • Use the marker parameter to access more objects in batches. Refer to [Query Expansion](#query-expansion).
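The marker loop described above can be sketched as follows. `listPage` stands in for any function returning the `{ isTruncated, nextMarker, contents }` body shape of listObjects; with the real SDK it would wrap `client.listObjects`:

```javascript
// Sketch: page through a bucket with the marker parameter until
// isTruncated is false, collecting every object key.
async function listAllKeys(listPage, bucketName) {
    const keys = [];
    let marker; // undefined on the first request: start from the beginning
    do {
        const body = await listPage(bucketName, { marker: marker });
        for (const item of body.contents) {
            keys.push(item.key);
        }
        // nextMarker is only meaningful while the listing is truncated
        marker = body.isTruncated ? body.nextMarker : undefined;
    } while (marker);
    return keys;
}

// With the real SDK, roughly:
// listAllKeys((b, opts) => client.listObjects(b, opts).then(r => r.body), bucket);
```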

          Query Expansion

          Users can refine queries by setting listObjects parameters. The supported extended parameters are as follows:

          | Parameter name | Description | Default value |
          | --- | --- | --- |
          | maxKeys | Sets the maximum number of objects returned in one request; the upper limit is 1000. | 1000 |
          | prefix | Sets a prefix for the ObjectKey; only keys that start with the prefix value are returned. Usually used together with delimiter in [Search for Simulation Folders](#search-for-simulation-folders). | - |
          | delimiter | A delimiter used to group ObjectKeys into hierarchies; usually used together with prefix in [Search for Simulation Folders](#search-for-simulation-folders). The portion of the ObjectKey from the prefix up to (and including) the first occurrence of the delimiter is returned as a commonPrefix. | - |
          | marker | A string that sets the starting position of the returned result; once marker is set, objects are returned in alphabetical order starting after that value. | - |
          • Sample Code

              // Parameter setting 
              var options = {
                  delimiter: '/',
                  marker: '123'
              };
              
              client.listObjects(<bucketName>, options)
                  .then(function (response) {
                      var contents = response.body.contents;
                      for (var i = 0, l = contents.length; i < l; i++) {
                          console.log(contents[i].key);
                      }
                  })
                  .catch(function (error) {
                      // Failed to inquire
                  });

          Search for Simulation Folders

          BOS itself is a flat (<Key>, <Value>) storage system, so in principle there is no concept of a "folder". However, you can simulate folders through the combination of the delimiter and prefix parameters.

          Suppose the bucket contains 5 objects: bos.jpg, fun/, fun/test.jpg, fun/movie/001.avi, and fun/movie/007.avi ("/" is used as the delimiter to simulate folders).
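How the server derives the two result lists from these flat keys can be illustrated with a local simulation (a sketch of the matching rule, not an SDK call):

```javascript
// Simulate how prefix and delimiter split flat keys into "files" (contents)
// and "subfolders" (commonPrefixes).
function simulateListing(keys, prefix, delimiter) {
    const contents = [];
    const commonPrefixes = new Set(); // deduplicates rolled-up prefixes
    for (const key of keys) {
        if (!key.startsWith(prefix)) {
            continue; // keys outside the prefix are ignored
        }
        const rest = key.slice(prefix.length);
        const idx = delimiter ? rest.indexOf(delimiter) : -1;
        if (idx === -1) {
            contents.push(key); // no further delimiter: a file under the prefix
        } else {
            // roll everything up to the first delimiter into one common prefix
            commonPrefixes.add(prefix + rest.slice(0, idx + 1));
        }
    }
    return { contents: contents, commonPrefixes: [...commonPrefixes] };
}

// With the 5 keys above, prefix 'fun/' and delimiter '/' yield
// contents ['fun/', 'fun/test.jpg'] and commonPrefixes ['fun/movie/'];
// omitting the delimiter lists everything under 'fun/' recursively.
```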

          Lists All Files in Simulation Folders Recursively

          Set the Prefix parameter to access all files in a simulation folder:

          // Parameter setting
          let options = {
              prefix: 'fun/' // Recursively list all files in the fun directory.
          };
          
          
          client.listObjects(<bucketName>, options)
          .then(function (response) {
              console.log('Objects:');
              var contents = response.body.contents;
              for (var i = 0, l = contents.length; i < l; i++) {
                  console.log(contents[i].key);
              }
          })
          .catch(function (error) {
              // Failed to inquire
          });

          Output:

          Objects:
          fun/
          fun/movie/001.avi
          fun/movie/007.avi
          fun/test.jpg

          Check the Files and Subfolders in Simulation Folders

          The files and subfolders in simulation folders can be listed with the combination of Prefix and Delimiter:

          // Parameter setting 
          let options = {
              prefix: 'fun/', // List all files and folders under fun file directory.
              delimiter: '/' // "/" is a folder separator. 
          };
          
          client.listObjects(<bucketName>, options)
          .then(function (response) {
              console.log('Objects:');
              var contents = response.body.contents;
              for (var i = 0, l = contents.length; i < l; i++) {
                  console.log(contents[i].key);
              }
              console.log('CommonPrefixs:');
              var commonPrefixes = response.body.commonPrefixes;
              for (i = 0, l = commonPrefixes.length; i < l; i++) {
                  console.log(commonPrefixes[i]);
              }
          })
          .catch(function (error) {
              // Failed to inquire
          });

          Output:

          Objects:
          fun/
          fun/test.jpg

          CommonPrefixs:
          fun/movie/

          Note: The objects returned in the contents list are the files directly under the fun folder, and all subfolders under the fun folder appear in the commonPrefixes list. The files fun/movie/001.avi and fun/movie/007.avi are not listed because they are inside the movie subfolder of the fun folder.

          Obtain Object (Only Supported in Node.js)

          Access Object Easily

          You can read an object as a stream with the following code.

          • Basic procedure

            1. Create a BosClient. 2. Call the getObject() method.

          • Sample Code

              let range = '0-100';
              client.getObject(<BucketName>, <Key>, range)
                  .then(function(response) {
                      let buffer = response.body;
                  });

            Note: Setting range to 0-100 retrieves only bytes 0 through 100 of the data; you can use this for segmented, resumable transfers of files. If range is not set, the whole object is retrieved.
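For a segmented download, the byte ranges passed to getObject can be precomputed from the object size. A sketch; the 'start-end' strings follow the inclusive range format used above:

```javascript
// Split an object of known size into inclusive byte-range strings
// of the form 'start-end', as accepted by the range argument above.
function buildRanges(totalSize, chunkSize) {
    const ranges = [];
    for (let start = 0; start < totalSize; start += chunkSize) {
        const end = Math.min(start + chunkSize, totalSize) - 1;
        ranges.push(start + '-' + end);
    }
    return ranges;
}

// buildRanges(250, 100) -> ['0-99', '100-199', '200-249']
```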

          Download Object to the Specified Path

          You can download object directly to the specified path through the following code.

          • Basic procedure

            1. Create a BosClient. 2. Call the getObjectToFile() method; the object is downloaded to the specified path.

          • Sample Code

              let range = '0-100';
              client.getObjectToFile(<bucketName>, <key>, <filePath>, range)
                  .then(function() {
                      // Downloaded successfully 
                  });

            Note: Setting range to 0-100 retrieves only bytes 0 through 100 of the data; you can use this for segmented, resumable transfers of files. If range is not set, the whole object is downloaded. filePath is the full path to save the file, including the file name and extension.

          Get ObjectMetadata Only

          The getObjectMetadata() method gets only the object's metadata, not the object's body.

          • Sample Code

              client.getObjectMetadata(<BucketName>, <ObjectKey>)
                .then(function(response) {
                    console.dir(response.http_headers);
                });

          Delete Object

          • Basic procedure

            1. Create a BosClient. 2. Call the deleteObject() method.

          • Sample Code

              // Delete Object
              client.deleteObject(<BucketName>, <ObjectKey>);   // Specify the bucket name and key of the object to be deleted.
          • Complete example

            • Browser end: let BosClient = baidubce.sdk.BosClient
            • Node.js: import {BosClient} from '@baiducloud/sdk'

                let config = {
                    endpoint: <EndPoint>,            //Domain name of bucket region
                    credentials: {
                        ak: <AccessKeyID>,           //Your AK 
                        sk: <SecretAccessKey>       //Your SK
                    }
                };
                
                let client = new BosClient(config);
                
                client.deleteObject(<BucketName>, <ObjectKey>);

          Copy Object

          Simply Copy Object

          • Basic procedure

            1. Create a BosClient. 2. Call the copyObject() method.

          • Sample Code

              let options = {
                'x-bce-meta-foo1': 'bar1', // Overwrite custom meta information
                'x-bce-meta-foo2': 'bar2', // Overwrite custom meta information
                'x-bce-meta-foo3': 'bar3'  // Overwrite custom meta information
              };
              // <SrcBucketName>, <SrcKey> identify the source object; <DestBucketName>, <DestKey> the copy destination.
              client.copyObject(<SrcBucketName>, <SrcKey>, <DestBucketName>, <DestKey>, options);

          Multipart Upload of Object

          In addition to uploading files to BOS with the putObject() method, BOS provides another upload mode: Multipart Upload. You can use Multipart Upload in scenarios such as (but not limited to) the following:

          • Breakpoint upload support is required.
          • The file to upload is larger than 5 GB.
          • The network conditions are poor, and the connection with BOS servers is often disconnected.
          • The file needs to be uploaded as a stream.
          • The size of the uploaded file cannot be determined before uploading it.

          Multipart upload is more complex than direct upload; it is divided into 3 stages:

          • Initialize upload (initiateMultipartUpload)
          • Upload part (uploadPartFromBlob)
          • Complete upload (completeMultipartUpload)

          Code Example of Browser End

          Divide Files into Parts

          let PART_SIZE = 5 * 1024 * 1024; // Specify part size 
          
          function getTasks(file, uploadId, bucketName, key) {
              let leftSize = file.size;
              let offset = 0;
              let partNumber = 1;
          
              let tasks = [];
          
              while (leftSize > 0) {
                  let partSize = Math.min(leftSize, PART_SIZE);
                  tasks.push({
                      file: file,
                      uploadId: uploadId,
                      bucketName: bucketName,
                      key: key,
                      partNumber: partNumber,
                      partSize: partSize,
                      start: offset,
                      stop: offset + partSize - 1
                  });
          
                  leftSize -= partSize;
                  offset += partSize;
                  partNumber += 1;
              }
              return tasks;
          }
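To sanity-check the splitter, the same loop can be reduced to part sizes only (a standalone restatement of the logic above, so it runs without the SDK):

```javascript
// Same splitting rule as getTasks above, reduced to the part sizes:
// full PART_SIZE chunks, with a smaller final remainder.
function splitSizes(totalSize, partSize) {
    const sizes = [];
    for (let left = totalSize; left > 0; left -= partSize) {
        sizes.push(Math.min(left, partSize));
    }
    return sizes;
}

const MB = 1024 * 1024;
// A 12 MB file with 5 MB parts yields three parts: [5 MB, 5 MB, 2 MB].
```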

          Process the Upload Logic of Each Part

          function uploadPartFile(state, client) {
              return function(task, callback) {
                  let blob = task.file.slice(task.start, task.stop + 1);
                  client.uploadPartFromBlob(task.bucketName, task.key, task.uploadId, task.partNumber, task.partSize, blob)
                      .then(function(res) {
                          ++state.loaded;
                          callback(null, res);
                      })
                      .catch(function(err) {
                          callback(err);
                      });
              };
          }

          Initialize the Multipart Upload, Upload the Parts, and Complete the Upload

          let uploadId;
          client.initiateMultipartUpload(bucket, key, options)
              .then(function(response) {
                  uploadId = response.body.uploadId; // Initialize upload, and obtain the uploadId generated by server 
          
                  let deferred = sdk.Q.defer();
                  let tasks = getTasks(blob, uploadId, bucket, key);
                  let state = {
                      lengthComputable: true,
                      loaded: 0,
                      total: tasks.length
                  };
          
                  // To manage multipart upload, async (https://github.com/caolan/async) library is used for asynchronous processing. 
                  let THREADS = 2; // Number of parts uploaded simultaneously 
                  async.mapLimit(tasks, THREADS, uploadPartFile(state, client), function(err, results) {
                      if (err) {
                          deferred.reject(err);
                      } else {
                          deferred.resolve(results);
                      }
                  });
                  return deferred.promise;
              })
              .then(function(allResponse) {
                  let partList = [];
                  allResponse.forEach(function(response, index) {
                      // Generate list of parts
                      partList.push({
                          partNumber: index + 1,
                          eTag: response.http_headers.etag
                      });
                  });
                  return client.completeMultipartUpload(bucket, key, uploadId, partList); // Complete Upload.
              })
              .then(function (res) {
                  // Uploaded successfully
              })
              .catch(function (err) {
                  // Failed to upload, add your code
                  console.error(err);
              });

          Multipart Upload on the Node.js End

          Divide Files into Parts, Initialize the UploadId, and Upload Parts

          let PART_SIZE = 5 * 1024 * 1024; // Specify part size
          let uploadId;
          client.initiateMultipartUpload(bucket, key, options)
              .then(function(response) {
                  uploadId = response.body.uploadId; // Initialize upload, and obtain the uploadId generated by server 
          
                  let deferred = sdk.Q.defer();
                  let tasks = getTasks(blob, uploadId, bucket, key);
                  let state = {
                      lengthComputable: true,
                      loaded: 0,
                      total: tasks.length
                  };
          
                  // To manage multipart upload, async (https://github.com/caolan/async) library is used for asynchronous processing. 
                  let THREADS = 2; // Number of parts uploaded simultaneously 
                  async.mapLimit(tasks, THREADS, uploadPartFile(state, client), function(err, results) {
                      if (err) {
                          deferred.reject(err);
                      } else {
                          deferred.resolve(results);
                      }
                  });
                  return deferred.promise;
              })
              .then(function(allResponse) {
                  let partList = [];
                  allResponse.forEach(function(response, index) {
                      // Generate list of parts
                      partList.push({
                          partNumber: index + 1,
                          eTag: response.http_headers.etag
                      });
                  });
                  return client.completeMultipartUpload(bucket, key, uploadId, partList); // Complete Upload.
              })
              .then(function (res) {
                  // Uploaded successfully 
              })
              .catch(function (err) {
                  // Failed to upload, add your code 
                  console.error(err);
              });
          
          function getTasks(file, uploadId, bucketName, key) {
              let leftSize = file.size;
              let offset = 0;
              let partNumber = 1;
          
              let tasks = [];
          
              while (leftSize > 0) {
                  let partSize = Math.min(leftSize, PART_SIZE);
                  tasks.push({
                      file: file,
                      uploadId: uploadId,
                      bucketName: bucketName,
                      key: key,
                      partNumber: partNumber,
                      partSize: partSize,
                      start: offset,
                      stop: offset + partSize - 1
                  });
          
                  leftSize -= partSize;
                  offset += partSize;
                  partNumber += 1;
              }
              return tasks;
          }
          function uploadPartFile(state, client) {
              return function(task, callback) {
                  return client.uploadPartFromFile(task.bucketName, task.key, task.uploadId, task.partNumber, task.partSize, task.file , task.start)
                      .then(function(res) {
                          ++state.loaded;
                          callback(null, res);
                      })
                      .catch(function(err) {
                          callback(err);
                      });
              };
          }

          Cancel Multipart Upload Event

          You can use the abortMultipartUpload() method to cancel a multipart upload.

          client.abortMultipartUpload(<BucketName>, <ObjectKey>, <UploadId>);

          Get Unfinished Multipart Upload Events

          Users can obtain the uncompleted multipart upload events in a bucket with the listMultipartUploads() method.

          client.listMultipartUploads(<bucketName>)
              .then(function (response) {
                  // Traverse all unfinished upload events.
                  for (var i = 0; i < response.body.multipartUploads.length; i++) {
                      console.log(response.body.multipartUploads[i].uploadId);
                  }
              });

          Get All Uploaded Part Information

          Users can obtain all uploaded parts of an upload event with the listParts() method.

          client.listParts(<bucketName>, <key>, <uploadId>)
              .then(function (response) {
                  // Traverse all uploaded parts.
                  for (var i = 0; i < response.body.parts.length; i++) {
                      console.log(response.body.parts[i].partNumber);
                  }
              });
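The listParts result also makes resuming an interrupted upload straightforward: filter the task list down to part numbers the server has not seen yet. A sketch; remainingTasks is a hypothetical helper, not an SDK method:

```javascript
// Keep only the tasks whose partNumber is absent from the server-side
// part list (as returned by listParts), i.e. the parts still to upload.
function remainingTasks(tasks, uploadedParts) {
    const done = new Set(uploadedParts.map(p => p.partNumber));
    return tasks.filter(t => !done.has(t.partNumber));
}

// With the SDK, uploadedParts would come from:
// client.listParts(bucket, key, uploadId).then(r => r.body.parts)
```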