Google cloud storage Can't upload file to GCS using gsutil

Copying file://InstreamImpression.csv.gz [Content-Type=application/octet-stream]... AccessDeniedException: 401 Login Requiredfe13d1e0fb408639_4...: 46.75 MB/46.77 MB CommandException: 1 files/objects could not be transferred. It seems like the whole object is being transferred, but it gives a 401 error at the end, and it has been happening for a while. I ran "gcloud auth login" a couple of times, but I still get the same error. I am able to upload the files from a different machine. Any idea?

Google cloud storage Error creating a disk in Google Compute (The resource '... diskTypes/{pd-ssd}' was not found)

I am trying to create a disk in Google Compute. This is the command and the error: gcloud compute disks create test --type={pd-ssd} For the following disks: - [test] choose a zone: [1] asia-east1-a [2] asia-east1-c [3] asia-east1-b [4] europe-west1-c [5] europe-west1-b [6] europe-west1-d [7] us-central1-a [8] us-central1-b [9] us-central1-c [10] us-central1-f Please enter your numeric choice: 9 NAME ZONE SIZE_GB TYPE STATUS ERROR: (gcloud.compute.disks.create) Some requests did not succ

Is google-cloud-storage pricing dependent on the bucket location?

I have a dedicated server in Canada and would like to use google-cloud-storage as an offsite backup, so I will be pushing/uploading data from Canada to Google Cloud Storage located in us-east1. Will there be any pricing difference if I create a bucket in us-central1 instead of us-east1, or is the pricing the same for any location? I ask this because I have been using Amazon services, and their pricing depends entirely on the location of the Amazon server/services.

Google cloud storage Minimum storage duration for "Regional" or "Multi-Regional" GCS buckets

The Google Cloud Storage documentation (https://cloud.google.com/storage/docs/storage-classes) says there is a "minimum storage duration" for Nearline (30 days) and Coldline (90 days) storage, but there is no such description for Regional and Multi-Regional storage. Does that mean there is absolutely no minimum storage duration? Is the billing unit a microsecond, second, minute, hour, or a day? For example, (unrealistically) suppose that I created a google cloud storage bucket, and copied 10 petab

Google cloud storage Is this a serious Warning while accessing GCS Bucket directly from Dataproc Spark Job?

I am running a Spark 2.2 job on Dataproc and I need to access a bunch of Avro files located in a GCP storage bucket. To be specific, I need to access the files DIRECTLY from the bucket (i.e. NOT have them copied onto the master machine first, both because they might be very large and for compliance reasons). I am using the gs://XXX notation to refer to the bucket inside the Spark code, based on recommendations in this doc: https://cloud.google.com/dataproc/docs/concepts/connectors/clo
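For illustration, here is a minimal PySpark sketch of the direct-access pattern described above, assuming the Dataproc image already provides the GCS connector and that the spark-avro package for Spark 2.x is on the classpath; the bucket path is a placeholder, not the poster's actual bucket.

from pyspark.sql import SparkSession

# Read the Avro files straight from the bucket via the gs:// scheme -- nothing is
# copied onto the master machine first. "gs://XXX/path/*.avro" is a placeholder.
spark = SparkSession.builder.appName("read-avro-from-gcs").getOrCreate()
df = spark.read.format("com.databricks.spark.avro").load("gs://XXX/path/*.avro")
df.show(5)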

Google cloud storage PermissionDenied: 403 error when trying to run async Google Cloud Speech async transcribe

I'm getting the following error when trying to run an async transcription request on a .flac file hosted on Google Cloud Storage. $ python3 transcribe_async.py gs://[file].flac Traceback (most recent call last): File "[]/anaconda3/lib/python3.6/site-packages/google/api_core/grpc_helpers.py", line 54, in error_remapped_callable return callable_(*args, **kwargs) File "[]/anaconda3/lib/python3.6/site-packages/grpc/_channel.py", line 514, in __call__ return _end_unary_response_blocking(state,
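A common cause of this 403 (stated here as an assumption, since the traceback is cut off) is that the account running the Speech client cannot read the .flac object. A sketch with the Python storage client of granting that account read access on the bucket; the bucket and service-account names are placeholders:

from google.cloud import storage

# Grant the service account used by the Speech client read access to the bucket
# holding the audio file, so storage.objects.get is not what triggers the 403.
client = storage.Client()
bucket = client.bucket("my-audio-bucket")
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.objectViewer",
    "members": ["serviceAccount:my-sa@my-project.iam.gserviceaccount.com"],
})
bucket.set_iam_policy(policy)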

Google cloud storage Alternatives to JSON file credentials?

My Java backend server has to upload files to Google Cloud Storage (GCS). Right now I just run: public void store(MultipartFile multipartFile) throws IOException { Storage storage = StorageOptions.getDefaultInstance().getService(); storage.create( BlobInfo.newBuilder( BUCKET_NAME, Objects.requireNonNull(multipartFile.getOriginalFilename())) .build(), multipartFile.getBytes() ); } Having set G

Google cloud storage How do I resolve a "The policy has been modified by another process" error in Google Cloud SQL?

I'm trying to export the contents of a MySQL table from Google Cloud SQL into a Cloud Storage bucket, and I'm running into an error: The policy has been modified by another process. Please try again. Yesterday, I happily imported CSV data to my Cloud SQL database left and right, and when I tried to write some of the modified data from a query out to another CSV file, I got tripped up. So I followed the directions here to try to resolve my issue: https://cloud.google.com/sql/docs/mysql/impo

Google cloud storage BigQuery Transfer Service UI - run_date parameter

Has anybody had any success in applying a run_date parameter when creating a transfer in BigQuery using the Transfer Service UI? I'm taking a CSV file from Google Cloud Storage and I want to mirror this into my ingestion-date-partitioned table, table_a. Initially I set the destination table as table_a, which resulted in the following message in the job log: Partition suffix has to be set for date-partitioned tables. Please recreate your transfer config with a valid table name. For example, to lo

Google cloud storage How can I download a Google Cloud Storage object by metadata info?

The documentation is here: https://developers.google.com/storage/docs/reference-methods#getobject I use this function to try to download an object, but it can only respond with metadata info: getFile: (file_id, callback)-> log("getFileMetadata") unless callback callback = (resp) -> log "Read Complete" ,resp params = path : "/storage/v1beta2/b/#{@BUCKET}/o/#{file_id}" method : "GET" headers: host: "storage.googleapis.com" "
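For comparison, a small Python sketch of the same kind of JSON API call (the CoffeeScript above is truncated and targets the older v1beta2 path; this uses the current v1 endpoint for illustration). The key detail is that objects.get returns metadata by default and only returns the object payload when alt=media is added; the bucket, object and token values are placeholders.

import requests
from urllib.parse import quote

BUCKET, OBJECT_NAME, TOKEN = "my-bucket", "my-object.txt", "ya29.placeholder-token"
url = "https://www.googleapis.com/storage/v1/b/%s/o/%s" % (BUCKET, quote(OBJECT_NAME, safe=""))
headers = {"Authorization": "Bearer %s" % TOKEN}

meta = requests.get(url, headers=headers).json()                     # metadata only (the default)
data = requests.get(url, params={"alt": "media"}, headers=headers)   # the object's contents
with open("downloaded_file", "wb") as fh:
    fh.write(data.content)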

Google cloud storage GCS Connector Class com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem not found

We are trying to run Hive queries on HDP 2.1 using the GCS Connector. It was working fine until yesterday, but since this morning our jobs have randomly started failing. When we restart them manually they just work fine. I suspect it has something to do with the number of parallel Hive jobs running at a given point in time. Below is the error message: vertexId=vertex_1407434664593_37527_2_00, diagnostics=[Vertex Input: audience_history initializer failed., java.lang.ClassNotFoundException: Class com.g

Google cloud storage Google compute engine, How to send 301 to redirect to https

I want to be able to serve the website always over HTTPS. I am not an Apache specialist, but this is how people do it out there: RewriteEngine on RewriteCond %{SERVER_PORT} !^443$ RewriteRule ^/(.*) https://%{HTTP_HOST}/$1 [NC,R=301,L] code from: https://serverfault.com/questions/570288/is-it-bad-to-redirect-http-to-https Is there a way to do this with GCE? I can only see forwarding rules in the docs, but nothing about redirects. Thank you Leo

Google cloud storage Rate limiting in Google Cloud Storage

At the top of every minute my code uploads between 20 and 40 files in total (from multiple machines, about 5 files in parallel until they are all uploaded) to Google Cloud Storage. I frequently get 429 - Too Many Requests errors, like the following: java.io.IOException: Error inserting: bucket: mybucket, object: work/foo/hour/out/2015/08/21/1440191400003-e7ba2b0c-b71b-460a-9095-74f37661ae83/2015-08-21T20-00-00Z/ at com.google.cloud.hadoop.gcsio.GoogleCloudStorageImpl.wrapException(GoogleCloudStora
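The stack trace above comes from the Hadoop GCS connector (Java), but the usual remedy for 429 responses, retrying with exponential backoff and jitter, can be sketched with the Python client for brevity; the bucket, object and path names are placeholders, not the poster's setup.

import random
import time

from google.api_core import exceptions
from google.cloud import storage

def upload_with_backoff(bucket_name, blob_name, path, max_attempts=6):
    # Retry 429 responses with exponential backoff plus jitter before giving up.
    blob = storage.Client().bucket(bucket_name).blob(blob_name)
    for attempt in range(max_attempts):
        try:
            blob.upload_from_filename(path)
            return
        except exceptions.TooManyRequests:
            time.sleep((2 ** attempt) + random.random())
    raise RuntimeError("still rate limited after %d attempts" % max_attempts)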

Google cloud storage gsutil failing on worker_thread.start()

I have been using gsutil for the past 3 months. I noticed an error that occurs randomly from time to time, e.g. sometimes once a week, sometimes once every two weeks, and so on. From the log, it looks like gsutil fails to start a new thread for executing the command: Process SyncManager-1: Traceback (most recent call last): File "/usr/lib64/python2.6/multiprocessing/process.py", line 232, in _bootstrap Process SyncManager-ProTraceback (most rTraceback (most re File "/usr/lib6 File

Google cloud storage Google cloud storage Rate limit error - why?

I'm having a problem with Google Cloud Storage and gsutil when using the -m option for file uploads. I want to upload ~400 files in multi-threaded mode (using the -m option), but I get errors: AccessDeniedException: 403 Rate limit exceeded. Please retry this request later. This is the command I use: gsutil -m -q rsync -R -c -d -e mydir gs://mybucket/mydir1/mydir I'm running this from a GCE instance, but with a custom service account user which has the following access scopes: Editor; Storage Admin; Storage Object Admin; Storag

Google cloud storage Google Cloud Storage ACL confusion

I'm the owner of a Google Cloud project with a Google Cloud Storage bucket inside. All our backups are moved to this bucket. When I try to retrieve some of the backups, I get a permission denied. I'm not able to do anything but list the bucket. When I try to reset the bucket ACL with gsutil acl ch -u xxx@yyy.zzz:FC gs://abc/** I get the following error: CommandException: Failed to set acl for gs://abc/1234.sql. Please ensure you have OWNER-role access to this resource. Which makes

Google cloud storage Google Cloud Storage confused about ACL/IAM and legacy permissions

I have a bucket whose contents I want to be publicly readable. However, I do not want users to be able to list all of the contents by removing the key name from the URL. For the sake of simplicity, please assume that I am setting these permissions via the console. Setting Storage Object Viewer for allUsers allows me to access the objects as well as list the contents. Setting Storage Legacy Object Reader for allUsers allows me to access the objects but not list the contents. This seems

Google cloud storage Objects do not inherit bucket permissions

In GCS, when adding permissions to a bucket (NOT the whole project; just a single bucket inside that project), you used to be able to set up the permissions of a bucket so that any NEW objects put in the bucket inherit the bucket's permissions. In the newest version of GCS, however, we have not been able to figure out how to do this. We can set permissions on a root bucket: { "email": "someuser@someaccount.iam.gserviceaccount.com", "entity": "someuser@someaccount.iam.gservice
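One way the old behaviour is commonly reproduced, offered here as an assumption since the question is about the console UI, is to edit the bucket's default object ACL, which is what newly created objects copy at creation time. A sketch with the Python client; the bucket name reuses the placeholder account from the snippet above:

from google.cloud import storage

bucket = storage.Client().bucket("my-bucket")
acl = bucket.default_object_acl
acl.reload()   # start from the bucket's current default object ACL
# New objects pick up this default ACL, so granting the user here is what makes
# future uploads "inherit" the permission.
acl.user("someuser@someaccount.iam.gserviceaccount.com").grant_read()
acl.save()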

Google cloud storage Is gsutil cp secure for uploading sensitive data?

I'm reading the docs about how to use Google Cloud, particularly how to store data in a bucket. I can see the gcloud scp command to upload a file to a VM in a secure way (highlighted in the doc). To upload to a bucket, it says to use gsutil cp. Is this command secure? If I want to upload sensitive data, do I have to take more precautions (and how)?

Google cloud storage Google Cloud Storage : Google Cloud Storage JSON API to fix Access Not Configured error 403?

I am using an HTTP POST request to upload a file onto Google Cloud Storage. Request is: POST /upload/storage/v1beta1/b/myBucket/o?uploadType=media&name=testFile HTTP/1.1 Host: www.googleapis.com Content-length: 24 Content-type: text/plain Authorization: OAuth ba26.AHXXXXXXXXXVHXdVRAhBAHR_UXXXXXLV-MqPMXXXJwc <BINARY DATA - 24B> I am getting the following error as the response: { "error": { "errors": [ { "domain": "usageLimits", "reason": "accessNotConfigured", "m

Google cloud storage Request is too large

I am trying to upload a file of around 7 GB to Google Cloud Storage. I used the HttpRequest class to upload and chose a "resumable" upload. I also set the read timeout to 20000000. If I upload a smaller file, it works fine. For a bigger file, such as 6 GB, it returns: Exception in thread "main" com.google.api.client.http.HttpResponseException: 400 Bad Request Request is too large. at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1050) Is there any one who successfully uploade
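The question uses the Java HttpRequest class directly; purely as an illustration of the chunked resumable-upload idea (not the poster's code), here is a Python sketch where setting a chunk size makes the client stream the file in fixed pieces instead of one oversized request. Bucket and file names are placeholders.

from google.cloud import storage

blob = storage.Client().bucket("my-bucket").blob("big-file.bin")
blob.chunk_size = 16 * 1024 * 1024   # 16 MiB chunks force a chunked resumable upload
blob.upload_from_filename("/path/to/7gb-file.bin")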

Google cloud storage Google Cloud Storage can't retrieve buckets or contents of buckets

As of this morning I am unable to access my storage buckets. When I select the Google Cloud Storage tab in the navigation, everything loads as expected, but rather than displaying my two buckets it displays an alert bar saying: We were unable to retrieve your buckets. Click to Retry Since I know the direct link to my bucket, I tried clicking on that, and again the page loads successfully, but I receive a new error message stating: Unable to retrieve objects. Click to Retry This is true

Google cloud storage Google Cloud Storage public URL to use in img src

First off, I have read and searched Google and everywhere else, but I can't find a solution to my problem. I can upload images to my Google Cloud Storage bucket. Then I want to get the public URL so I can save it to my MongoDB and access it later for my site. I have code here, written in Go. // profile func, that receive filename, and file name, self_link, err := google_storage.Upload(handler.Filename, file) // then after acquiring the self_link, i update the user db.UpdateUser(session_id.(stri
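The question's code is Go; shown in Python only to sketch the usual pattern (make the single object publicly readable and store its stable public URL, which an img src can point at). Bucket and object names are placeholders, and this assumes the bucket allows per-object ACLs.

from google.cloud import storage

blob = storage.Client().bucket("my-bucket").blob("images/avatar.png")
blob.upload_from_filename("avatar.png")
blob.make_public()          # grants READER on this one object to allUsers
print(blob.public_url)      # e.g. https://storage.googleapis.com/my-bucket/images/avatar.png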

Google cloud storage Boto GCS authentication setup failure: no such file

I am trying to set up Boto to work with GCS using OAuth2 authentication. gsutil config -e begins the authentication process, but when it asks "What is the full path to your private key file?" I get OSError: No such file or directory. Why would this happen? It doesn't work with the .json version of the private key file either. I wish Boto for GCS didn't need a path to the private key file.

Google cloud storage In Google Cloud, Compute Engine persistent disks vs. Cloud Storage: which should I use and why?

Hello all, I am doing a startup and I am using Google Cloud Platform as the cloud server to launch my Android app. Now, I was reading through Google's docs, but I can't figure out how I am going to put my scripts on Google Cloud, because I came across two things: Cloud Storage, and Compute Engine's persistent disks. I also googled this question and it led me here: https://cloud.google.com/compute/docs/faq#pdgcs. I read it, but now I am more confused, because initially I thought that when

Google cloud storage Restore Previous Versions of Files From Google Cloud Storage

We used GCS for our offsite backup and it has been working great. However, one of our office computers got infected with ransomware over the weekend, which encrypted all of our shared network files. The thing is, our backup script does an rsync every evening and syncs up our shared network files, which means that all the live versions of the files on GCS are now the encrypted ransomware files. We use versioning and keep 4 versions of all files. Is there a way to restore the version previou
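Since versioning is enabled on the bucket, archived generations of each object should still exist. A sketch with the Python client of listing those generations and copying an older one back over the live version; bucket and object names are placeholders, and picking the second-newest generation is only an example of choosing a pre-infection copy.

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-backup-bucket")

# List every generation of one object, oldest first.
versions = sorted(
    client.list_blobs(bucket, prefix="shared/report.xlsx", versions=True),
    key=lambda b: b.time_created,
)
for blob in versions:
    print(blob.name, blob.generation, blob.time_created)

clean = versions[-2]   # e.g. the generation just before the ransomware ran
bucket.copy_blob(clean, bucket, "shared/report.xlsx", source_generation=clean.generation)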

Google cloud storage Azure Logic app with Google Cloud Storage connection

It was pretty straightforward setting up a connection to Google Drive, since there are options for it in the Logic App Designer. But I can't find any similar options for connecting to Google Cloud Storage. Am I missing something, or do I have to go with a Function App in Azure and write my own code for connecting to GCP? "actions": { "Create_file": { "type": "ApiConnection", "inputs": { "host": { "connection": { "name": "@parameters('$connections')['googledrive

Google cloud storage Read GCS file into dask dataframe

I want to read a CSV file stored in Google Cloud Storage using a dask dataframe. I have installed gcsfs & dask in the conda env on Windows. import dask.dataframe as dd import gcsfs project_id = 'my-project' token_file = 'C:\\path\to\credentials.json' fs = gcsfs.GCSFileSystem(project=project_id) gcs_bucket_name = 'my_bucket' df = dd.read_csv('gs://'+gcs_bucket_name+'/my_file.csv',storage_options={'token': token_file, 'project': project_id}) I know I'm not providing the key file correctly a
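One likely issue in the snippet, noted as an assumption, is the Windows path: 'C:\\path\to\credentials.json' leaves \t and \c as escape sequences. A sketch of the same read with a raw-string path and the service-account credentials passed to gcsfs as a dict; the bucket, file and project names are the question's placeholders.

import json

import dask.dataframe as dd

token_file = r"C:\path\to\credentials.json"   # raw string so "\t" is not interpreted as a tab
with open(token_file) as fh:
    creds = json.load(fh)

df = dd.read_csv(
    "gs://my_bucket/my_file.csv",
    storage_options={"token": creds, "project": "my-project"},
)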

Google cloud storage How best to store HTML alongside strings in Cloud Storage

I have a collection of data, and in each case there is a chunk of HTML and a few strings, for example html: <div>html...</div>, name string: html chunk 1, date string: 01-01-1999, location string: London, UK. I would like to store this information together as a single Cloud Storage object. Specifically, I am using Google Cloud Storage. There are two ways I can think of doing this. One is to store the strings as custom metadata, and the HTML as the actual file contents. The other is to
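A sketch of the first option the question mentions, using the Python client with a placeholder bucket and object name: the HTML is the object body and the short strings ride along as custom metadata on that same object.

from google.cloud import storage

blob = storage.Client().bucket("my-bucket").blob("chunks/html-chunk-1")
blob.metadata = {"name": "html chunk 1", "date": "01-01-1999", "location": "London, UK"}
blob.upload_from_string("<div>html...</div>", content_type="text/html")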

Google cloud storage Why do some buckets not appear after a gsutil ls?

When I do gsutil ls -p myproject-id I get a list of buckets (in my case 2 buckets), which I expect to be the list of all my buckets in the project: gs://bucket-one/ gs://bucket-two/ But if I do gsutil ls -p myproject-id gs://asixtythreecharacterlongnamebucket I actually get the elements of that long-named bucket: gs://asixtythreecharacterlongnamebucket/somefolder/ So my question is: why, when I do an ls on the project, don't I get the long-named bucket in the results? The only explanation it made

Google cloud storage Gsutil notification stopchannel giving failure

I followed this tutorial (Object Change Notification); it worked for the watch step, but I cannot stop watching. The response: $ gsutil notification stopchannel my_channel-id my_resource-id Removing channel my_channel-id with resource identifier my_resource-id Failure: initializer for ctype 'char *' must be a str or list or tuple, not unicode. gsutil version: 4.6 EDIT: problem fixed. I debugged with the -DD option: $ gsutil -DD notification stopchannel my_channel-id my_resource-id It printed exc

Google cloud storage Are static sites hosted on Google Cloud Storage accessible through HTTPS?

According to this post from 2014, HTTPS is not available for static sites on Google Cloud Storage: https://stackoverflow.com/a/22767544/46799 Is this still the case? If so, are there any plans to add this functionality? My site is hosted on GCS and I have a CNAME entry which maps my URL to a bucket on GCS. I need to start providing access to the site through HTTPS now; am I out of luck?

Google cloud storage To encode or not to encode path parts in GCS?

Should path parts be encoded or not encoded when it comes to Google Cloud Storage? Encoding URI path parts says they should be encoded, but Object names talks about the possibility of naming GCS objects in a seemingly-hierarchical manner... So if I name an object abc/xyz, is the path to my object https://www.googleapis.com/storage/v1/b/example-bucket/o/abc%2fxyz or https://www.googleapis.com/storage/v1/b/example-bucket/o/abc/xyz? Which is it!? Somebody please help me with this confusion.
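A small illustration of the first form, assuming the JSON API is being called directly: when the object name goes into the request path, every character of the name, including the "/" in abc/xyz, must be percent-encoded so the whole name is one path segment.

from urllib.parse import quote

bucket, name = "example-bucket", "abc/xyz"
url = "https://www.googleapis.com/storage/v1/b/%s/o/%s" % (bucket, quote(name, safe=""))
print(url)   # .../o/abc%2Fxyz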

Google cloud storage Slow GCS upload speeds when using HTTP/2

Context: We are uploading files directly to GCS from a web interface via AJAX calls using a parallel composite upload approach. While running tests in different scenarios, we noticed that on some networks the upload speed is capped at around 50 Mbps, even though on all of them the bandwidth is between 100 Mbps and 1 Gbps. We ran gsutil perfdiag inside one of the "troubled" networks in order to emulate the web-interface upload approach and got significantly better performance. When comparing the b

Google cloud storage Filtering GCS uris before job execution

I have a frequent use case I couldn't solve. Let's say I have a filepattern like gs://mybucket/mydata/*/files.json where * is supposed to match a date. Imagine I want to keep 251 dates (this is an example; say a large number of dates, but without a meta-pattern to match them like 2019* or else). For now, I have two options: create a TextIO for every single file, which is overkill and fails almost every time (graph too large); or read ALL the data and then filter it within my job, which
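One common direction, sketched with the Beam Python SDK under the assumption that this is a Beam/Dataflow pipeline (the Java SDK has analogous FileIO/ReadAll transforms): build the explicit list of paths up front and read them through a single ReadAll source instead of one TextIO per file. The dates and bucket are placeholders.

import apache_beam as beam

wanted_dates = ["20190101", "20190315"]   # ... the full list of 251 dates to keep
paths = ["gs://mybucket/mydata/%s/files.json" % d for d in wanted_dates]

with beam.Pipeline() as p:
    lines = (
        p
        | "Paths" >> beam.Create(paths)
        | "Read" >> beam.io.ReadAllFromText()
    )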

Google cloud storage How to create a public bucket in googleapis.com?

I found a public bucket on the web which contains some files that everybody can view or download. I want to create the same thing, where I can upload similar files. Here is what the permissions of this public bucket look like: Unfortunately I cannot create the same thing; I have searched the whole web but haven't found any step-by-step explanation. Could anybody help me with this?
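For illustration, a minimal sketch with the Python client of making a whole bucket publicly readable by granting roles/storage.objectViewer to allUsers; the bucket name is a placeholder and the same change can be made from the console's Permissions tab.

from google.cloud import storage

bucket = storage.Client().bucket("my-public-bucket")
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({"role": "roles/storage.objectViewer", "members": ["allUsers"]})
bucket.set_iam_policy(policy)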

Google cloud storage Restricting encrypt/decrypt permissions for a Cloud KMS key with CMEK and Cloud Storage

I have two storage buckets in one Google cloud project, say storage-project. One bucket with default encryption, and another bucket encrypted with a Customer Managed Key (CMEK) created in another project called security-project. I have granted the role Cloud KMS CryptoKey Encrypter/Decrypter to the Cloud Storage service account (service-xxxxxxxx@gs-project-accounts.iam.gserviceaccount.com) in the storage-project. I could successfully upload files to this storage bucket using a Google account wh

Google cloud storage How to access signed Google Storage url created with gsutil?

I think I just created a signed URL with gsutil, but it never asked me for the private key's password; it just dropped back to a command prompt after I issued the command, without any output. I tried to access the bucket using commondatastorage.googleapis.com as the URL and the secret strings as username/password, but I get 403 Forbidden errors. How do I access my buckets through my S3 browser app? Thanks.
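For reference, signed URLs are self-contained links fetched over plain HTTPS rather than a username/password pair. A sketch of generating one with the Python client instead of gsutil signurl, assuming a service-account JSON key; the key path, bucket and object names are placeholders.

import datetime

from google.cloud import storage

client = storage.Client.from_service_account_json("key.json")
blob = client.bucket("my-bucket").blob("backup.tar.gz")
url = blob.generate_signed_url(expiration=datetime.timedelta(hours=1), method="GET")
print(url)   # paste into a browser, curl, or any HTTP client -- no credentials prompt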
