Package cognitoidentity provides the API client, operations, and parameter types for Amazon Cognito Identity. Amazon Cognito Federated Identities is a web service that delivers scoped temporary credentials to mobile devices and other untrusted environments. It uniquely identifies a device and supplies the user with a consistent identity over the lifetime of an application. Using Amazon Cognito Federated Identities, you can enable authentication with one or more third-party identity providers (Facebook, Google, or Login with Amazon) or an Amazon Cognito user pool, and you can also choose to support unauthenticated access from your app. Cognito delivers a unique identifier for each user and acts as an OpenID token provider trusted by AWS Security Token Service (STS) to access temporary, limited-privilege AWS credentials. For a description of the authentication flow, see Authentication Flow in the Amazon Cognito Developer Guide. For more information, see Amazon Cognito Federated Identities.
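As a brief orientation, here is a minimal sketch of creating the client and requesting an identity with this package; the identity pool ID is a placeholder and error handling is kept minimal:

	package main

	import (
		"context"
		"fmt"
		"log"

		"github.com/aws/aws-sdk-go-v2/aws"
		"github.com/aws/aws-sdk-go-v2/config"
		"github.com/aws/aws-sdk-go-v2/service/cognitoidentity"
	)

	func main() {
		ctx := context.Background()

		// Load region and credentials from the environment / shared config.
		cfg, err := config.LoadDefaultConfig(ctx)
		if err != nil {
			log.Fatal(err)
		}
		client := cognitoidentity.NewFromConfig(cfg)

		// GetId returns a unique identifier for the user. The pool ID below
		// is a placeholder; substitute your own identity pool.
		out, err := client.GetId(ctx, &cognitoidentity.GetIdInput{
			IdentityPoolId: aws.String("us-east-1:00000000-0000-0000-0000-000000000000"),
		})
		if err != nil {
			log.Fatal(err)
		}
		fmt.Println("identity:", aws.ToString(out.IdentityId))
	}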
Package main (doc.go) : This is a CLI tool to execute Google Apps Script (GAS) from a terminal. Do you want to develop GAS on your local PC? Generally, when we develop GAS, we have to log in to Google with a browser and develop it in the Script Editor. I wanted a more convenient local environment for developing GAS, so I created "ggsrun". Its main job is to execute GAS from a local terminal and retrieve the results from Google. 1. Develop GAS using the terminal and text editor you are accustomed to. 2. Execute GAS while passing values to your script. 3. Execute GAS written in CoffeeScript. 4. Download a spreadsheet, document or presentation while simultaneously executing GAS. 5. Download files from and upload files to Google Drive. 6. Download standalone scripts and bound scripts. 7. Download all files and folders in a specific folder. 8. Upload script files and create projects as standalone or container-bound scripts. 9. Update projects. 10. Retrieve revision files of Google Docs and retrieve versions of projects. 11. Rearrange scripts in a project. 12. Modify manifests in a project. 13. Search files in Google Drive using a search query and regex. 14. Manage permissions of files. 15. Get Drive information. 16. From v1.7.0, ggsrun can be used not only with OAuth2 but also with a Service Account. You can see the release page at https://github.com/tanaikech/ggsrun/releases ggsrun uses the Execution API, Web Apps and the Drive API on Google. For how to install ggsrun, please check my GitHub repository: https://github.com/tanaikech/ggsrun/ You can read the detailed information there. --------------------------------------------------------------- # How to Execute Google Apps Script Using ggsrun When you have the configuration file `ggsrun.cfg`, you can execute GAS. If you cannot find it, please download `client_secret.json` and run $ ggsrun auth When using the Execution API, $ ggsrun e1 -s sample.gs If you want to execute a function other than the default `main()`, you can use an option like `-f foo`. The command `e1` (alias of `exe1`) can also be used to execute a function in an existing project. $ ggsrun e1 -f foo $ ggsrun e2 -s sample.gs With `e2`, you cannot select an executing function other than the default `main()`. `e1` and `e2` mean using the Execution API, and `-s` gives the GAS script file name. The sample commands shown here use the Execution API. In these cases, the executing function is `main()`, the default, in the script. When using Web Apps, $ ggsrun w -s sample.gs -p password -u [ WebApps URL ] `w` and `-p` mean using Web Apps and the password you set on the server side, respectively. `-u` gives the Web Apps URL, like `-u https://script.google.com/macros/s/#####/exec`. --------------------------------------------------------------- Package main (ggsrun.go) : This file includes all commands and options. Package main (handler.go) : Handler for ggsrun. Package main (init.go) : These methods are for reading and writing the configuration file (ggsrun.cfg). Package main (materials.go) : Materials for ggsrun. Package main (oauth.go) : Gets the access token using the refresh token, and checks the condition of the access token. Package main (projectupdater.go) : These methods are for updating a project. Package main (scriptrearrange.go) : These methods are for rearranging scripts in a project. Package main (sender.go) : These methods are for sending GAS scripts to Google Drive.
Maestro is a SQL-centric tool for orchestrating BigQuery jobs. Maestro also supports data transfers from and to Google Cloud Storage (GCS) and relational databases (presently PostgreSQL and MySQL). Maestro is a "catalog" of SQL statements. A key feature of Maestro is the ability to infer dependencies by examining the SQL, without any additional configuration. Maestro can execute all tasks in the correct order without a manually specified order (i.e. a "DAG"). Execution can be associated with a frequency (cadence) without requiring any cron or cron-like configuration. Maestro is an ever-running service (daemon). It uses PostgreSQL to store the SQL and all other configuration, state and history. The daemon takes great care to maintain all of its state in PostgreSQL so that it can be stopped or restarted without interrupting any in-progress jobs (in most cases). Maestro records all BigQuery job and other history, and has a notion of users and groups, which is useful for attributing costs and resource utilization. Maestro has a basic web-based user interface implemented in React, though its API can also be used directly. Maestro can notify arbitrary applications of job completion via a simple HTTP request. Maestro also provides a Python client library for a more native Python experience. Maestro integrates with Google OAuth for authentication, Google Sheets for simple exports, Github (for SQL revision control) and Slack (for alerts and notifications). Maestro was designed with simplicity as one of its primary goals. It trades the flexibility usually afforded by configurability in various languages for the transparency and clarity achievable by leveraging the declarative nature of SQL. Maestro works best for environments where BigQuery is the primary store of all data for analytical purposes. E.g. the data may be periodically imported into BigQuery from various databases. Once imported, data may be subsequently summarized or transformed via a sequence of BigQuery jobs. The summarized data can then be exported to external databases/applications for additional processing (e.g. SciPy) and possibly be imported back into BigQuery, and so on. Every step of this process can be orchestrated by Maestro without relying on any external scheduling facility such as cron. Below is a listing of all the key concepts with explanations. A table is the central object in Maestro. It always corresponds to a table in BigQuery. Maestro code and documentation use the verb "run" with respect to tables. To "run a table" means to perform whatever action is called for in its configuration and store the result in a BigQuery table. A table is (in most cases) defined by a BigQuery SQL statement. There are three kinds of tables in Maestro. A summary table is produced by executing a BigQuery SQL statement (a Query job). An import table is produced by executing SQL on an external database and importing the result into BigQuery. The SQL statement in this case is intentionally restricted to a primitive which supports only SELECT, FROM, WHERE and LIMIT, so as to discourage users from running complex and taxing queries on the database server. The main purpose of this SQL statement is to filter out or transform columns; any other processing is best done subsequently in BigQuery. An external table is a table whose data comes from GCS. The import is triggered via the Maestro API. Such tables are generally used when BigQuery data needs to be processed by an external tool, e.g. SciPy, etc. A job is a BigQuery job.
BigQuery has three types of jobs: query, extract and load. All three types are used in Maestro. These details are internal but should be familiar to developers. A BigQuery query job is executed as part of running a table. A BigQuery extract job is executed as part of running a table, after the query job is complete. It results in one or more extract files in GCS. Maestro provides signed URLs to the GCS files so that external tools require no authentication to access the data. This is also facilitated via the Maestro Python client library. A BigQuery load job is executed as part of running an import table. It is the last step of the import, after the external database table data has been copied to GCS. A run is a complex process which happens periodically, according to a frequency. For example, if a daily frequency is defined, then Maestro will construct a run once per day, selecting all tables (including import tables) assigned to this frequency, computing the dependency graph and creating jobs for each table. The jobs are then executed in the correct order based on their position in the graph and the number of workers available. Maestro will also assign priority based on the number of child dependencies a table has, thus running the most "important" tables first. PostgreSQL 9.6 or later is required to run Maestro. Building a "production" binary, i.e. with all assets included in the binary itself, requires Webpack. Webpack is not necessary for "development" mode, which uses Babel for transpilation. Download and compile Maestro with "go get github.com/voxmedia/maestro". (Note that this will create a $GOPATH/bin/maestro binary, which is not very useful; you can delete it.) From here, cd $GOPATH/src/github.com/voxmedia/maestro and run go build. You should now have a "maestro" binary in this directory. You can also create a "production" binary by running "make build". This will combine all the JavaScript code into a single file and pack it and all other assets into the maestro binary itself, so that to deploy you only need the binary and no other files. Create a PostgreSQL database named "maestro". If you name it something other than that, you will need to provide that name to Maestro via the -db-connect flag, which defaults to "host=/var/run/postgresql dbname=maestro sslmode=disable" and should work on most Linux distros. On MacOS the Postgres socket is likely to be in "/private/tmp", and one way to address this is to run "ln -s /private/tmp /var/run/postgresql". Maestro connects to many services and needs credentials for all of them. These credentials are stored in the database, all encrypted using the same shared secret, which must be specified on the command line via the -secret argument. The -secret argument is meant mostly for development; in production it is much more secure to use the -secretpath option pointing to the location of a file containing the secret. From the Google Cloud perspective, it is best to create a project entirely dedicated to Maestro, with the BigQuery and GCS APIs enabled, then create a Service Account (in IAM) dedicated to Maestro, as well as OAuth credentials. The service account will need the BigQuery Editor, Job User and Storage Object Admin roles. Run Maestro like so: ./maestro -secret=whatever where "whatever" is the shared secret you invent and need to remember.
You should now be able to visit the Maestro UI; by default it is at http://localhost:3000. When you click on the log-in link, since at this point Maestro has no OAuth configuration, you will be presented with a form asking for the relevant info, which you will need to provide. You should then be redirected to the Google OAuth login page. From here on, the configuration is stored in the database in encrypted form. As the first user of this Maestro instance, you are automatically marked as "admin", which means you can perform any action. As an admin, you should see the "Admin" menu in the upper right. Click on it and select the "Credentials" option. You now need to populate the credentials. The BigQuery credentials, default dataset and GCS bucket are required, while Git and Slack are optional but highly recommended. Note that the BigQuery dataset and the GCS bucket are not created by Maestro; you need to create those manually. The GCS bucket is used for data exports, and it is generally a good idea to set the data in it to expire after several days or whatever works for you. If you need to import data from external databases, you can add those credentials under the Admin / Databases menu. You may want to create a frequency (also under the Admin menu). A frequency is how periodic jobs are scheduled in Maestro. It is defined by a period and an offset. The period is passed to the time.Truncate() function, and when the result is 0, a run is triggered. The offset is an offset into the period. E.g. to define a frequency that starts a run at 4am UTC, you need to specify a period of 86400 and an offset of 14400, as sketched below. Note that Maestro needs to be restarted after these configuration changes (this will be fixed later). At this point you should be able to create a summary table with some simple SQL, e.g. "SELECT 'hello' AS world", save it and run it. If it executes correctly, you should be able to see this new table in the BigQuery UI.
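To make the period/offset arithmetic concrete, here is a small illustrative Go sketch; the shouldTrigger helper is a hypothetical reading of the scheme described above, not Maestro's actual scheduler code:

	package main

	import (
		"fmt"
		"time"
	)

	// shouldTrigger reports whether now falls exactly offset into its current
	// period window, per the period/offset scheme described above.
	func shouldTrigger(now time.Time, period, offset time.Duration) bool {
		return now.Sub(now.Truncate(period)) == offset
	}

	func main() {
		// A period of 86400s with an offset of 14400s triggers at 04:00 UTC.
		at4am := time.Date(2024, 5, 1, 4, 0, 0, 0, time.UTC)
		fmt.Println(shouldTrigger(at4am, 86400*time.Second, 14400*time.Second)) // true
	}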
Package cognitoidentity provides the client and types for making API requests to Amazon Cognito Identity. Amazon Cognito is a web service that delivers scoped temporary credentials to mobile devices and other untrusted environments. Amazon Cognito uniquely identifies a device and supplies the user with a consistent identity over the lifetime of an application. Using Amazon Cognito, you can enable authentication with one or more third-party identity providers (Facebook, Google, or Login with Amazon), and you can also choose to support unauthenticated access from your app. Cognito delivers a unique identifier for each user and acts as an OpenID token provider trusted by AWS Security Token Service (STS) to access temporary, limited-privilege AWS credentials. To provide end-user credentials, first make an unsigned call to GetId. If the end user is authenticated with one of the supported identity providers, set the Logins map with the identity provider token. GetId returns a unique identifier for the user. Next, make an unsigned call to GetCredentialsForIdentity. This call expects the same Logins map as the GetId call, as well as the IdentityID originally returned by GetId. Assuming your identity pool has been configured via the SetIdentityPoolRoles operation, GetCredentialsForIdentity will return AWS credentials for your use. If your pool has not been configured with SetIdentityPoolRoles, or if you want to follow the legacy flow, make an unsigned call to GetOpenIdToken, which returns the OpenID token necessary to call STS and retrieve AWS credentials. This call expects the same Logins map as the GetId call, as well as the IdentityID originally returned by GetId. The token returned by GetOpenIdToken can be passed to the STS operation AssumeRoleWithWebIdentity (http://docs.aws.amazon.com/STS/latest/APIReference/API_AssumeRoleWithWebIdentity.html) to retrieve AWS credentials. If you want to use Amazon Cognito in an Android, iOS, or Unity application, you will probably want to make API calls via the AWS Mobile SDK. To learn more, see the AWS Mobile SDK Developer Guide (http://docs.aws.amazon.com/mobile/index.html). See https://docs.aws.amazon.com/goto/WebAPI/cognito-identity-2014-06-30 for more information on this service. See the cognitoidentity package documentation for more information. https://docs.aws.amazon.com/sdk-for-go/api/service/cognitoidentity/ To use Amazon Cognito Identity with the SDK, use the New function to create a new service client. With that client you can make API requests to the service. These clients are safe to use concurrently. See the SDK's documentation for more information on how to use the SDK. https://docs.aws.amazon.com/sdk-for-go/api/ See the aws.Config documentation for more information on configuring SDK clients. https://docs.aws.amazon.com/sdk-for-go/api/aws/#Config See the Amazon Cognito Identity client CognitoIdentity for more information on creating a client for this service. https://docs.aws.amazon.com/sdk-for-go/api/service/cognitoidentity/#New
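To illustrate the flow above, here is a minimal hedged sketch: an unauthenticated GetId followed by GetCredentialsForIdentity. The Logins map is omitted for the unauthenticated case, and the pool ID is a placeholder:

	package main

	import (
		"fmt"
		"log"

		"github.com/aws/aws-sdk-go/aws"
		"github.com/aws/aws-sdk-go/aws/session"
		"github.com/aws/aws-sdk-go/service/cognitoidentity"
	)

	func main() {
		sess := session.Must(session.NewSession(&aws.Config{Region: aws.String("us-east-1")}))
		svc := cognitoidentity.New(sess)

		// Step 1: obtain a unique identity ID (placeholder pool ID).
		idOut, err := svc.GetId(&cognitoidentity.GetIdInput{
			IdentityPoolId: aws.String("us-east-1:00000000-0000-0000-0000-000000000000"),
		})
		if err != nil {
			log.Fatal(err)
		}

		// Step 2: exchange the identity ID for temporary AWS credentials.
		credsOut, err := svc.GetCredentialsForIdentity(&cognitoidentity.GetCredentialsForIdentityInput{
			IdentityId: idOut.IdentityId,
		})
		if err != nil {
			log.Fatal(err)
		}
		fmt.Println("access key:", aws.StringValue(credsOut.Credentials.AccessKeyId))
	}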
Package function is a Google Cloud Function receiving webhook events from DNSimple (https://dnsimple.com/webhooks). It reacts to `dnssec.rotation_start` and `dnssec.rotation_complete` events and passes the new DS record on to DK Hostmaster via their DS Update protocol (https://github.com/DK-Hostmaster/dsu-service-specification). The cloud function needs to be configured through environment variables. The `TOKEN` environment variable is the access token that should be added as a URL query parameter to the trigger URL (e.g. `?token=abcdefeghijklmnopqrstuvxyz0123456789`). The `DNSIMPLE_TOKEN` environment variable is a DNSimple API token that is used to retrieve DS records from DNSimple. For each domain in your DNSimple account that you would like this cloud function to update in DK Hostmaster, you need to add three environment variables. They should all be prefixed with the domain ID from DNSimple (e.g. 123456). `123456_DOMAIN`: the (apex) domain name in DK Hostmaster. `123456_USERID`: the DK Hostmaster handle you use to log in to their self service. `123456_PASSWORD`: the DK Hostmaster password you use to log in to their self service.
Package google_oauth_handler transparently handles OAuth authentication with Google. Create an Authenticator and then insert it as middleware in front of any resources you want to protect behind Google login, via authenticator.Handle. Handle will call the next middleware with (w, r, *Token), which you can use to make requests to the Google API. The Authenticator handles the OAuth workflow for you, redirecting users to Google, handling the callback and setting an encrypted cookie in the user's browser.
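A minimal wiring sketch follows, assuming an already-constructed *Authenticator named auth, assuming Handle returns an http.Handler, and assuming the *Token passed to the callback is golang.org/x/oauth2's Token type; only the (w, r, *Token) callback shape is taken from the description above, so check the package docs for the exact signatures:

	import (
		"fmt"
		"net/http"

		"golang.org/x/oauth2"
	)

	// routes protects /dashboard behind Google login. The Authenticator
	// construction is elided; see this package for its constructor. The
	// http.Handler return type of Handle is an assumption.
	func routes(auth *Authenticator) http.Handler {
		mux := http.NewServeMux()
		mux.Handle("/dashboard", auth.Handle(func(w http.ResponseWriter, r *http.Request, tok *oauth2.Token) {
			// tok can be used to make requests to the Google API.
			fmt.Fprintf(w, "logged in; token expires %s\n", tok.Expiry)
		}))
		return mux
	}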
Package function is a Google Cloud Function receiving webhook events from DNSimple (https://dnsimple.com/webhooks). It reacts to `dnssec.rotation_start` and `dnssec.rotation_complete` events and passes the new DS record on to Punktum.dk via their DS Update protocol (https://github.com/Punktum-dk/dsu-service-specification). The cloud function needs to be configured through environment variables. The `TOKEN` environment variable is the access token that should be added as a URL query parameter to the trigger URL (e.g. `?token=abcdefeghijklmnopqrstuvxyz0123456789`). The `DNSIMPLE_TOKEN` environment variable is a DNSimple API token that is used to retrieve DS records from DNSimple. For each domain in your DNSimple account that you would like this cloud function to update in Punktum.dk, you need to add three environment variables. They should all be prefixed with the domain ID from DNSimple (e.g. 123456). `123456_DOMAIN`: the (apex) domain name in Punktum.dk. `123456_USERID`: the Punktum.dk handle you use to log in to their self service. `123456_PASSWORD`: the Punktum.dk password you use to log in to their self service.
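As a sketch of how such per-domain settings could be looked up, here is a small illustrative Go helper; the domainConfig function and the hard-coded domain ID are hypothetical, not this package's actual source:

	package main

	import (
		"fmt"
		"os"
	)

	// domainConfig reads the three per-domain variables for a DNSimple
	// domain ID, e.g. 123456_DOMAIN, 123456_USERID and 123456_PASSWORD.
	func domainConfig(domainID string) (domain, userID, password string, ok bool) {
		domain = os.Getenv(domainID + "_DOMAIN")
		userID = os.Getenv(domainID + "_USERID")
		password = os.Getenv(domainID + "_PASSWORD")
		ok = domain != "" && userID != "" && password != ""
		return
	}

	func main() {
		if domain, userID, _, ok := domainConfig("123456"); ok {
			fmt.Printf("would update %s at Punktum.dk as %s\n", domain, userID)
		}
	}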