
@google-cloud/cloud-sql-connector
A JavaScript library for connecting securely to your Cloud SQL instances
The Cloud SQL Node.js Connector is a Cloud SQL connector designed for use with the Node.js runtime. Using a Cloud SQL connector provides a native, in-process alternative to the Cloud SQL Auth Proxy for connecting securely to your Cloud SQL instances.
The Cloud SQL Node.js Connector is a package to be used alongside a database driver; the currently supported drivers are pg (PostgreSQL), mysql2 (MySQL), and tedious (SQL Server).
You can install the library using npm install:
npm install @google-cloud/cloud-sql-connector
To successfully make Cloud SQL connections, this library requires credentials for an IAM principal with permission to connect to the instance (for example, the Cloud SQL Client role) and the Cloud SQL Admin API enabled in your Google Cloud project.
This library uses the Application Default Credentials (ADC) strategy for resolving credentials. Please see these instructions for how to set your ADC (Google Cloud Application vs Local Development, IAM user vs service account credentials), or consult the Node.js google-auth-library.
The connector package is meant to be used alongside a database driver. The following examples show how to create a new connector and get valid options that can then be used when starting a new connection.
For even more examples, check the examples/ folder.
Here is how to start a new pg connection pool.
import pg from 'pg';
import {Connector} from '@google-cloud/cloud-sql-connector';

const {Pool} = pg;

const connector = new Connector();
const clientOpts = await connector.getOptions({
  instanceConnectionName: 'my-project:region:my-instance',
  ipType: 'PUBLIC',
});

const pool = new Pool({
  ...clientOpts,
  user: 'my-user',
  password: 'my-password',
  database: 'db-name',
  max: 5,
});

const {rows} = await pool.query('SELECT NOW()');
console.table(rows); // prints returned time value from server

await pool.end();
connector.close();
Here is how to start a new mysql2 connection pool.
import mysql from 'mysql2/promise';
import {Connector} from '@google-cloud/cloud-sql-connector';

const connector = new Connector();
const clientOpts = await connector.getOptions({
  instanceConnectionName: 'my-project:region:my-instance',
  ipType: 'PUBLIC',
});

const pool = await mysql.createPool({
  ...clientOpts,
  user: 'my-user',
  password: 'my-password',
  database: 'db-name',
});

const conn = await pool.getConnection();
const [result] = await conn.query(`SELECT NOW();`);
console.table(result); // prints returned time value from server

await pool.end();
connector.close();
Here is how to start a new tedious connection.
const {Connection, Request} = require('tedious');
const {Connector} = require('@google-cloud/cloud-sql-connector');

const connector = new Connector();
const clientOpts = await connector.getTediousOptions({
  instanceConnectionName: process.env.SQLSERVER_CONNECTION_NAME,
  ipType: 'PUBLIC',
});

const connection = new Connection({
  // Please note that the `server` property here is not used and is only defined
  // due to a bug in the tedious driver (ref: https://github.com/tediousjs/tedious/issues/1541).
  // With that in mind, do not try to change this value: it has no impact on how
  // the connector works. This README will be updated to remove this property
  // declaration as soon as the tedious driver bug is fixed.
  server: '0.0.0.0',
  authentication: {
    type: 'default',
    options: {
      userName: 'my-user',
      password: 'my-password',
    },
  },
  options: {
    ...clientOpts,
    // Please note that the `port` property here is not used and is only defined
    // due to a bug in the tedious driver (ref: https://github.com/tediousjs/tedious/issues/1541).
    // With that in mind, do not try to change this value: it has no impact on how
    // the connector works. This README will be updated to remove this property
    // declaration as soon as the tedious driver bug is fixed.
    port: 9999,
    database: 'my-database',
  },
});

connection.connect(err => {
  if (err) {
    throw err;
  }
  let result;
  const req = new Request('SELECT GETUTCDATE()', err => {
    if (err) {
      throw err;
    }
  });
  req.on('error', err => {
    throw err;
  });
  req.on('row', columns => {
    result = columns;
  });
  req.on('requestCompleted', () => {
    console.table(result);
  });
  connection.execSql(req);
});

connection.close();
connector.close();
Another possible way to use the Cloud SQL Node.js Connector is by creating a
local proxy server that tunnels to the secured connection established
using the Connector.startLocalProxy() method instead of
Connector.getOptions().
[!NOTE]
The startLocalProxy() method is currently only supported for MySQL and PostgreSQL, as it uses a Unix domain socket, which SQL Server does not currently support.
This alternative approach enables usage of the Connector library with otherwise unsupported drivers such as Prisma. Here is an example of how to use it with Prisma's PostgreSQL driver:
import {Connector} from '@google-cloud/cloud-sql-connector';
import {PrismaClient} from '@prisma/client';

const connector = new Connector();
await connector.startLocalProxy({
  instanceConnectionName: 'my-project:us-east1:my-instance',
  listenOptions: {path: '.s.PGSQL.5432'},
});

const hostPath = process.cwd();
const datasourceUrl =
  `postgresql://my-user:password@localhost/dbName?host=${hostPath}`;
const prisma = new PrismaClient({datasourceUrl});

// ... run queries with the Prisma client here (see the sketch below) ...

connector.close();
await prisma.$disconnect();
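As a minimal sketch of what such a query might look like (illustrative only; it belongs at the commented query step above, before the cleanup calls):

// Run a trivial raw query through the Prisma client created above.
const [row] = await prisma.$queryRaw`SELECT NOW()`;
console.log(row); // prints the current time reported by the server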
For examples on each of the supported Cloud SQL databases consult our Prisma samples.
The Cloud SQL Connector for Node.js can be used to connect to Cloud SQL instances using both public and private IP addresses, as well as Private Service Connect (PSC). The IP address type used for the connection is configured in getOptions through the ipType argument.
By default, connections are configured with 'PUBLIC' and connect over public IP. To configure connections to use an instance's private IP, set ipType to 'PRIVATE' as follows:
Note: If specifying Private IP or Private Service Connect, your application must be attached to the proper VPC network to connect to your Cloud SQL instance. For most applications this will require the use of a VPC Connector.
const clientOpts = await connector.getOptions({
  instanceConnectionName: 'my-project:region:my-instance',
  ipType: 'PRIVATE',
});

Similarly, to connect over Private Service Connect, set ipType to 'PSC':

const clientOpts = await connector.getOptions({
  instanceConnectionName: 'my-project:region:my-instance',
  ipType: 'PSC',
});
IpAddressTypes in TypeScript
For TypeScript users, the IpAddressTypes type can be imported and used directly:

import {Connector, IpAddressTypes} from '@google-cloud/cloud-sql-connector';

const clientOpts = await connector.getOptions({
  instanceConnectionName: 'my-project:region:my-instance',
  ipType: IpAddressTypes.PSC,
});
Connections using Automatic IAM database authentication are supported when using Postgres or MySQL drivers.
Make sure to configure your Cloud SQL Instance to allow IAM authentication and add an IAM database user.
A Connector can be configured to connect to a Cloud SQL instance using
automatic IAM database authentication with getOptions through the
authType argument.
const clientOpts = await connector.getOptions({
  instanceConnectionName: 'my-project:region:my-instance',
  authType: 'IAM',
});
When configuring a connection for IAM authentication, the password argument
can be omitted and the user argument should be formatted as follows:
Postgres: For an IAM user account, this is the user's email address. For a service account, it is the service account's email without the .gserviceaccount.com domain suffix.
MySQL: For an IAM user account, this is the user's email address, without the @ or domain name. For example, for test-user@gmail.com, set the user field to test-user. For a service account, this is the service account's email address without the @project-id.iam.gserviceaccount.com suffix.
Examples using the test-sa@test-project.iam.gserviceaccount.com
service account to connect can be found below.
import pg from 'pg';
import {Connector} from '@google-cloud/cloud-sql-connector';

const {Pool} = pg;

const connector = new Connector();
const clientOpts = await connector.getOptions({
  instanceConnectionName: 'my-project:region:my-instance',
  authType: 'IAM',
});

const pool = new Pool({
  ...clientOpts,
  user: 'test-sa@test-project.iam',
  database: 'db-name',
  max: 5,
});

const {rows} = await pool.query('SELECT NOW()');
console.table(rows); // prints returned time value from server

await pool.end();
connector.close();
import mysql from 'mysql2/promise';
import {Connector} from '@google-cloud/cloud-sql-connector';

const connector = new Connector();
const clientOpts = await connector.getOptions({
  instanceConnectionName: 'my-project:region:my-instance',
  authType: 'IAM',
});

const pool = await mysql.createPool({
  ...clientOpts,
  user: 'test-sa',
  database: 'db-name',
});

const conn = await pool.getConnection();
const [result] = await conn.query(`SELECT NOW();`);
console.table(result); // prints returned time value from server

await pool.end();
connector.close();
AuthTypes in TypeScript
For TypeScript users, the AuthTypes type can be imported and used directly for automatic IAM database authentication.
import {AuthTypes, Connector} from '@google-cloud/cloud-sql-connector';

const clientOpts = await connector.getOptions({
  instanceConnectionName: 'my-project:region:my-instance',
  authType: AuthTypes.IAM,
});
Google Auth Library: Node.js Client Credentials
One can use google-auth-library credentials with this library by providing an AuthClient or GoogleAuth instance to the Connector.
npm install google-auth-library
import {GoogleAuth} from 'google-auth-library';
import {Connector} from '@google-cloud/cloud-sql-connector';

const connector = new Connector({
  auth: new GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/sqlservice.admin'],
  }),
});
This can be useful when configuring credentials that differ from
Application Default Credentials. See the documentation
on the google-auth-library for more information.
The custom Google Auth Library auth property can also be used to set
auth-specific properties such as a custom quota project. Following up from the
previous example, here's how you can set a custom quota project using a custom
auth credential:
import {GoogleAuth} from 'google-auth-library';
import {Connector} from '@google-cloud/cloud-sql-connector';

const connector = new Connector({
  auth: new GoogleAuth({
    clientOptions: {
      quotaProjectId: '<custom quota project>',
    },
  }),
});
It is possible to change some of the library default behavior via environment variables. Here is a quick reference to supported values and their effect:
GOOGLE_APPLICATION_CREDENTIALS: If defined, the connector will use this file as a custom credentials file to authenticate to the Cloud SQL APIs. The value should be a path to a JSON file. You can find more on how to get a valid credentials file here.
GOOGLE_CLOUD_QUOTA_PROJECT: When defined, sets a custom quota project for calls to the Cloud SQL APIs.

The connector can be configured to use DNS to look up an instance. Use a DNS name managed by Cloud SQL Advanced Disaster Recovery, or a domain name that you manage.
Advanced Disaster Recovery creates geographically distributed replicas of your Cloud SQL database instance. When you perform switchover or failover on the database instance, the connector will gracefully disconnect from the old primary instance and reconnect to the new primary instance.
Follow the instructions in Connect using Write Endpoint to get the write endpoint DNS name for your primary instance. Then, use this write endpoint DNS name to configure the connector.
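For illustration, a minimal sketch of that configuration; the DNS name below is a placeholder, so substitute the write endpoint reported for your own instance:

import {Connector} from '@google-cloud/cloud-sql-connector';

const connector = new Connector();
const clientOpts = await connector.getOptions({
  // Hypothetical write endpoint DNS name; use the value returned by the
  // Connect using Write Endpoint instructions for your primary instance.
  domainName: 'my-instance.my-project.example-write-endpoint.sql.goog',
});
// Pass clientOpts to your database driver's pool, as in the earlier examples.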
The connector may also be configured to use DNS records that you define.
Add a DNS TXT record for the Cloud SQL instance to a private DNS server or a private Google Cloud DNS Zone used by your application.
Note: You are strongly discouraged from adding DNS records for your Cloud SQL instances to a public DNS server. This would allow anyone on the internet to discover the Cloud SQL instance name.
For example: suppose you wanted to use the domain name
prod-db.mycompany.example.com to connect to your database instance
my-project:region:my-instance. You would create the following DNS record:
Record type: TXT
Record name: prod-db.mycompany.example.com – the domain name used by the application
Record value: my-project:region:my-instance – the instance connection name

Configure the connector as described above, replacing the instance connection name with the DNS name.
Adapting the mysql2 example above:
import mysql from 'mysql2/promise';
import {Connector} from '@google-cloud/cloud-sql-connector';

const connector = new Connector();
const clientOpts = await connector.getOptions({
  domainName: 'prod-db.mycompany.example.com',
  ipType: 'PUBLIC',
});

const pool = await mysql.createPool({
  ...clientOpts,
  user: 'my-user',
  password: 'my-password',
  database: 'db-name',
});

const conn = await pool.getConnection();
const [result] = await conn.query(`SELECT NOW();`);
console.table(result); // prints returned time value from server

await pool.end();
connector.close();
For example: suppose the application is configured to connect using the domain name prod-db.mycompany.example.com. Initially, the private DNS zone has a TXT record with the value my-project:region:my-instance, and the application establishes connections to the my-project:region:my-instance Cloud SQL instance. Configure the connector using the domainName option, as shown above.
Then, to reconfigure the application to use a different database instance, change the value of the prod-db.mycompany.example.com DNS record from my-project:region:my-instance to my-project:other-region:my-instance-2.
The connector inside the application detects the change to this
DNS record. Now, when the application connects to its database using the
domain name prod-db.mycompany.example.com, it will connect to the
my-project:other-region:my-instance-2 Cloud SQL instance.
The connector will automatically close all existing connections to
my-project:region:my-instance. This will force the connection pools to
establish new connections. Also, it may cause database queries in progress
to fail.
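Because queries that are in flight at the moment of a switch can fail, applications may want a small retry wrapper around individual queries. The sketch below (using the pg pool from the earlier examples) is not part of the library; the attempt count and backoff are arbitrary, and a real application should only retry errors it knows to be transient:

// Retry a query a few times to ride out connections that were closed
// because the DNS record changed. The error handling here is deliberately
// naive; adapt it to your driver's error types.
async function queryWithRetry(pool, sql, attempts = 3) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await pool.query(sql);
    } catch (err) {
      lastError = err;
      // simple linear backoff before the next attempt
      await new Promise(resolve => setTimeout(resolve, 100 * (i + 1)));
    }
  }
  throw lastError;
}

const {rows} = await queryWithRetry(pool, 'SELECT NOW()');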
The connector will poll for changes to the DNS name every 30 seconds by default. You may configure the polling frequency using the Connector's failoverPeriod option. When this is set to 0, the connector disables polling and only checks whether the DNS record has changed when creating a new connection.
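As a sketch of disabling the background check, assuming failoverPeriod is accepted as a Connector constructor option (the placement and units are assumptions; consult the package's reference documentation):

import {Connector} from '@google-cloud/cloud-sql-connector';

// Assumption: failoverPeriod is passed when constructing the Connector.
// A value of 0 disables DNS polling, so the record is only re-checked
// when a new connection is created.
const connector = new Connector({failoverPeriod: 0});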
This project uses semantic versioning, and uses the following lifecycle regarding support for a major version:
Active - Active versions get all new features and security fixes (that wouldn’t otherwise introduce a breaking change). New major versions are guaranteed to be "active" for a minimum of 1 year.
Deprecated - Deprecated versions continue to receive security and critical bug fixes, but do not receive new features. Deprecated versions will be supported for 1 year.
Unsupported - Any major version that has been deprecated for >=1 year is considered unsupported.
Our client libraries follow the Node.js release schedule. Libraries are compatible with all current active and maintenance versions of Node.js. If you are using an end-of-life version of Node.js, we recommend that you update as soon as possible to an actively supported LTS version.
Google's client libraries support legacy versions of Node.js runtimes on a best-effort basis.
This project aims for a release on at least a monthly basis. If no new features or fixes have been added, a new PATCH version with the latest dependencies is released.
We welcome outside contributions. Please see our Contributing Guide for details on how best to contribute.
Apache Version 2.0
See LICENSE
mysql2 is a popular MySQL client for Node.js that supports both callbacks and promises. While it provides robust functionality for connecting to MySQL databases, it does not offer built-in support for Google Cloud SQL's specific authentication and connection management features like @google-cloud/cloud-sql-connector.
pg is a PostgreSQL client for Node.js. It offers a wide range of features for connecting to and interacting with PostgreSQL databases. Similar to mysql2, it does not include specific support for Google Cloud SQL's authentication and connection management, which @google-cloud/cloud-sql-connector provides.
knex is a SQL query builder for Node.js that supports multiple database types, including MySQL and PostgreSQL. It provides a flexible and powerful way to build and execute SQL queries. However, it lacks the specialized connection management and authentication features for Google Cloud SQL that @google-cloud/cloud-sql-connector offers.
FAQs
The npm package @google-cloud/cloud-sql-connector receives a total of 474,150 weekly downloads. As such, @google-cloud/cloud-sql-connector popularity was classified as popular.
We found that @google-cloud/cloud-sql-connector demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has 2 open source maintainers collaborating on the project.