31 skills found · Page 1 of 2
olizimmermann / S3dns: Find S3 AWS/GCP/Azure buckets while surfing. S3DNS acts as a DNS server, follows CNAMEs, and matches any bucket pattern.
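The CNAME-matching idea behind a tool like S3DNS can be sketched in a few lines: check whether a resolved hostname matches known cloud-storage endpoint patterns. The patterns below are illustrative, not S3DNS's actual list:

```python
import re

# Illustrative hostname patterns for common cloud storage endpoints.
# A tool like S3DNS would check resolved CNAME targets against such patterns.
BUCKET_PATTERNS = [
    re.compile(r"\.s3[.-][\w-]*\.?amazonaws\.com$"),  # AWS S3 endpoints
    re.compile(r"\.storage\.googleapis\.com$"),       # GCP Cloud Storage
    re.compile(r"\.blob\.core\.windows\.net$"),       # Azure Blob Storage
]

def looks_like_bucket(hostname: str) -> bool:
    """Return True if the hostname matches any known bucket endpoint pattern."""
    host = hostname.rstrip(".").lower()
    return any(p.search(host) for p in BUCKET_PATTERNS)
```

In a DNS-server setting, this check would run on every CNAME target seen while resolving queries, flagging hits as candidate buckets.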
leerob / Nextjs Gcp Storage: Example Next.js app to upload photos to a GCP storage bucket.
tjs-z / Wallpapers: Tired of paying a $50 subscription to a barely usable wallpaper app? Get the wallpapers for free, since the app served its images from an open GCP bucket anyway.
viltgroup / Bucket Restore: Point-in-time restores for buckets in Amazon Web Services (AWS) S3 and Google Cloud Platform (GCP) Storage.
alfonsof / Google Cloud Python Examples: Python examples on Google Cloud Platform (GCP) showing how to manage Compute Engine VM instances, Cloud Storage buckets, etc.
SweetOps / Terraform Google Storage Bucket: Terraform module for creating GCP storage buckets.
vinniepsychosis / ETL Mage GCP: An ETL pipeline built on GCP and orchestrated by Mage: extracts data from a GCS bucket, builds a dimensional model (star schema), loads the data into BigQuery, and feeds a Looker dashboard for further analysis.
AdnanHodzic / Wp Cloud Run: Ultimate WordPress setup on GCP Cloud Run, Cloud SQL & GCS buckets.
ageapps / K8s Storage Buckets: k8s-storage-buckets using AWS, GCP, and Azure distributed filesystems.
lightspin-tech / Red Bucket Gcp: No description available.
abinmn / Gcp Storage Bucket Action: No description available.
NishanthSingaraju / Gitlink: Turn your AWS and GCP buckets into your Git large file storage.
clglavan / Nvd Scrapper: Pull data from the National Vulnerability Database and push it to a GCP bucket.
tweag / Terraform Gcp Cdn Bucket: A Google Storage bucket + CDN configuration.
leddcode / Oculus: A domain OSINT tool for discovering environments, directories, and subdomains of a target domain. Also useful for finding S3 buckets, Azure Blob containers, Firebase DBs, GCP buckets, leaked email addresses, and MX records of a domain.
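Bucket discovery of the kind Oculus performs typically boils down to generating candidate bucket names from a target domain and then probing each storage endpoint. The name-generation step might look like this (the affix list is an illustrative assumption, not Oculus's actual wordlist):

```python
def bucket_candidates(domain: str) -> list[str]:
    """Generate candidate bucket names for a target domain.

    Combines the bare site name with a small, illustrative set of
    affixes commonly seen in bucket-naming conventions.
    """
    base = domain.split(".")[0]  # "example.com" -> "example"
    affixes = ["assets", "backup", "dev", "prod", "static", "uploads"]
    names = {base}
    for a in affixes:
        names.add(f"{base}-{a}")
        names.add(f"{a}-{base}")
    return sorted(names)

# Each candidate would then be probed against endpoints such as
# https://<name>.s3.amazonaws.com or https://storage.googleapis.com/<name>,
# treating any non-404 response as a hit worth inspecting.
```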
kjy / ArchitectingWithGCP Fundamentals Course2 EssentialCloudInfrastructureFoundation: Course lab notes.
- Console and Cloud Shell: Access Google Cloud Platform; create a Cloud Storage bucket using the GCP Console and Cloud Shell; understand shell features.
- Infrastructure Preview: Use Cloud Launcher to build a Jenkins continuous-integration environment; manage the service from the Jenkins UI; administer it from the virtual machine host over SSH.
- Virtual Networking: Understand network layout, place instances in various locations, and establish communication between virtual machines. Create an auto-mode network, a custom-mode network, and associated subnetworks; compare connectivity in the various types of networks. Create routes and firewall rules using IP addresses and tags to enable connectivity. Convert an auto-mode network to a custom-mode network; create, expand, and delete subnetworks.
- Bastion Host: Create an application web server representing a service for an internal corporate audience, and prevent the web server from accessing or being accessed from the internet. Create a maintenance server, called a bastion host, to gain access to the application server and verify internal connectivity.
- Virtual Machines: Create a utility virtual machine for administration purposes, a standard VM, and a custom VM; launch and delete both Windows and Linux VMs.
- Working with Virtual Machines: Create a customized n1-standard-1 instance with a 10 GB boot disk, 1 virtual CPU (vCPU), and 3.75 GB of RAM, running Debian Linux by default. Install base software (a headless JRE) and application software (a Minecraft game server). Attach a high-performance 50 GB persistent solid-state drive (SSD); the Minecraft server can support up to 50 players. Reserve a static external IP so the address remains consistent, and verify the gaming server is available online.
- Backups and Maintenance: Set up a backup system to back up the server's data to a Cloud Storage bucket and test it; automate backups using cron. Set up maintenance scripts using metadata for graceful startup and shutdown of the server.
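The cron-driven backup described above could be wired up with a crontab entry along these lines (paths, bucket name, and schedule are illustrative; `gsutil cp` is the standard way to copy files to a Cloud Storage bucket):

```
# Back up the Minecraft world to a Cloud Storage bucket every 4 hours.
# /home/minecraft/backup.sh would pause autosave, copy, then resume, e.g.:
#   gsutil cp -R /home/minecraft/world gs://my-backup-bucket/$(date +%F-%H%M)/
0 */4 * * * /home/minecraft/backup.sh
```

Writing each backup under a timestamped prefix keeps multiple restore points in the bucket instead of overwriting a single copy.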
orsinium-labs / Gcserve: Serve files from a GCP bucket.
Romboost-Repo / PipeLogger: No description available.
Bibhuti5 / Potato Disease Classification: Full-stack potato disease classifier.
Setup for Python:
- Install Python.
- Install Python packages: pip3 install -r training/requirements.txt and pip3 install -r api/requirements.txt
- Install TensorFlow Serving.
Setup for ReactJS:
- Install Node.js and NPM.
- Install dependencies: cd frontend, then npm install --from-lock-json and npm audit fix
- Copy .env.example as .env and change the API URL in .env.
Setup for the React Native app:
- Complete the initial React Native setup.
- Install dependencies: cd mobile-app, then yarn install, then cd ios && pod install && cd ../
- Copy .env.example as .env and change the API URL in .env.
Training the model:
- Download the data from Kaggle, keeping only the folders related to potatoes.
- Run jupyter notebook and open training/potato-disease-training.ipynb.
- In cell #2, update the path to the dataset, then run all the cells one by one.
- Copy the generated model into the models folder, saved with a version number.
Running the API using FastAPI:
- cd api
- uvicorn main:app --reload --host 0.0.0.0
- The API is now running at 0.0.0.0:8000.
Running the API using FastAPI & TF Serving:
- cd api
- Copy models.config.example as models.config and update the paths in the file.
- Run TF Serving (update the config file path below): docker run -t --rm -p 8501:8501 -v C:/Code/potato-disease-classification:/potato-disease-classification tensorflow/serving --rest_api_port=8501 --model_config_file=/potato-disease-classification/models.config
- Run the FastAPI server either directly from main.py or main-tf-serving.py using the PyCharm run option (as shown in the video tutorial), or from the command prompt: uvicorn main-tf-serving:app --reload --host 0.0.0.0
- The API is now running at 0.0.0.0:8000.
Running the frontend:
- cd frontend
- Copy .env.example as .env and update REACT_APP_API_URL if needed.
- npm run start
Running the mobile app:
- cd mobile-app
- Copy .env.example as .env and update the URL to the API URL if needed.
- Run the app: npm run android or npm run ios
Creating the TF Lite model:
- Run jupyter notebook and open training/tf-lite-converter.ipynb.
- In cell #2, update the path to the dataset, then run all the cells one by one.
- The model is saved in the tf-lite-models folder.
Deploying TF Lite on GCP:
- Create a GCP account and a project (keep note of the project id).
- Create a GCP bucket and upload the generated TF Lite model to the path models/potato-model.tflite.
- Install the Google Cloud SDK and authenticate: gcloud auth login
- Run the deployment script: cd gcp, then gcloud functions deploy predict_lite --runtime python38 --trigger-http --memory 512 --project project_id
- The model is now deployed; use Postman to test the Cloud Function via its trigger URL.
Deploying the TF model (.h5) on GCP:
- Same steps, but upload the model to models/potato-model.h5 and deploy with: gcloud functions deploy predict --runtime python38 --trigger-http --memory 512 --project project_id
Inspiration: https://cloud.google.com/blog/products/ai-machine-learning/how-to-serve-deep-learning-models-using-tensorflow-2-0-with-cloud-functions
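Inside the deployed predict/predict_lite function, the final step after running the model is mapping raw output scores to a label and confidence. That post-processing can be sketched in plain Python; the class names below are the usual ones for the potato disease dataset and are assumed here, not taken from this repo's code:

```python
import math

# Class names assumed from the potato subset of the PlantVillage dataset.
CLASS_NAMES = ["Early Blight", "Late Blight", "Healthy"]

def postprocess(logits: list[float]) -> tuple[str, float]:
    """Convert raw model outputs into (predicted class, confidence)."""
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return CLASS_NAMES[best], probs[best]
```

The HTTP handler would run inference on the uploaded image, then return this (label, confidence) pair as the JSON response body.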
glytching / Terraform Gcp Storage Bucket: A Terraform module for creating and managing GCS buckets.
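Modules like the Terraform ones listed above ultimately wrap Terraform's google_storage_bucket resource. A minimal standalone sketch, with placeholder name and location rather than any module's actual defaults:

```hcl
# Minimal GCS bucket (assumes the Google provider is already configured).
resource "google_storage_bucket" "example" {
  name                        = "my-unique-bucket-name"  # must be globally unique
  location                    = "US"
  uniform_bucket_level_access = true
}
```

The modules add value on top of this by layering in naming conventions, labels, lifecycle rules, and IAM bindings.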