I'm struggling to use JdbcIO with Apache Beam 2.0 (Java) to connect to a Cloud SQL instance from Dataflow within the same project.
I'm getting the following error:
java.sql.SQLException: Cannot create PoolableConnectionFactory (Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.)
According to the documentation, the Dataflow service account *@dataflow-service-producer-prod.iam.gserviceaccount.com should have access to all resources within the same project if it has the "Editor" role.
When I run the same Dataflow job with DirectRunner everything works fine.
This is the code I'm using:
private static final String JDBC_URL =
    "jdbc:mysql://myip:3306/mydb?verifyServerCertificate=false&useSSL=true";

PCollection<KV<String, Double>> exchangeRates = p.apply(JdbcIO.<KV<String, Double>>read()
    .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create("com.mysql.jdbc.Driver", JDBC_URL)
        .withUsername(JDBC_USER)
        .withPassword(JDBC_PW))
    .withQuery("SELECT CurrencyCode, ExchangeRate FROM mydb.mytable")
    .withCoder(KvCoder.of(StringUtf8Coder.of(), DoubleCoder.of()))
    .withRowMapper(new JdbcIO.RowMapper<KV<String, Double>>() {
        public KV<String, Double> mapRow(ResultSet resultSet) throws Exception {
            return KV.of(resultSet.getString(1), resultSet.getDouble(2));
        }
    }));
EDIT:
Opening a JDBC connection directly (outside of Beam) from another Dataflow job works fine with DataflowRunner, which tells me the database itself is probably not the problem:
java.sql.Connection connection = DriverManager.getConnection(JDBC_URL, JDBC_USER, JDBC_PW);
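For reference, a self-contained version of that connectivity check might look like this. It is only a sketch; the URL, user, and password are placeholders, not part of the original job:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ConnectivityCheck {
    public static void main(String[] args) throws Exception {
        // Placeholders: replace with your instance IP, database, user, and password.
        String url = "jdbc:mysql://myip:3306/mydb?verifyServerCertificate=false&useSSL=true";
        try (Connection connection = DriverManager.getConnection(url, "JDBC_USER", "JDBC_PW");
             Statement statement = connection.createStatement();
             ResultSet rs = statement.executeQuery("SELECT 1")) {
            rs.next();
            System.out.println("Connected: " + rs.getInt(1));
        }
    }
}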
Following these instructions on how to connect to Cloud SQL from Java:
https://cloud.google.com/sql/docs/mysql/connect-external-app#java
I managed to make it work.
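Note that this approach needs the Cloud SQL socket factory on the workers' classpath. Assuming a Maven build, the dependency is roughly the following (this is the artifact for MySQL Connector/J 5.x; the version shown is only illustrative, check for the latest release):

<dependency>
  <groupId>com.google.cloud.sql</groupId>
  <artifactId>mysql-socket-factory</artifactId>
  <version>1.0.5</version>
</dependency>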
This is what the code looks like (you must replace MYDBNAME, MYSQLINSTANCE, USER and PASSWORD with your values).
Heads up: the MYSQLINSTANCE format is project:region:instancename.
And I'm using a custom class (Customer) to store the values for each row, instead of key-value pairs.
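The Customer class itself isn't shown here; a minimal sketch that works with AvroCoder (the field names are my assumption, chosen to match the query columns) could look like this:

import org.apache.avro.reflect.Nullable;

public class Customer {
    private int customerId;
    @Nullable private String name;
    @Nullable private String location;
    @Nullable private String email;

    // AvroCoder needs a no-arg constructor for deserialization.
    public Customer() {}

    public Customer(int customerId, String name, String location, String email) {
        this.customerId = customerId;
        this.name = name;
        this.location = location;
        this.email = email;
    }
}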
p.apply(JdbcIO.<Customer>read()
    .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(
        "com.mysql.jdbc.Driver",
        "jdbc:mysql://google/MYDBNAME?cloudSqlInstance=MYSQLINSTANCE&socketFactory=com.google.cloud.sql.mysql.SocketFactory&user=USER&password=PASSWORD&useUnicode=true&characterEncoding=UTF-8"))
    .withQuery("SELECT CustomerId, Name, Location, Email FROM Customers")
    .withCoder(AvroCoder.of(Customer.class))
    .withRowMapper(new JdbcIO.RowMapper<Customer>() {
        @Override
        public Customer mapRow(java.sql.ResultSet resultSet) throws Exception {
            final Logger LOG = LoggerFactory.getLogger(CloudSqlToBq.class);
            LOG.info(resultSet.getString(2));
            // Email is the fourth column of the query (column 3 was mistakenly read twice before).
            return new Customer(resultSet.getInt(1), resultSet.getString(2),
                resultSet.getString(3), resultSet.getString(4));
        }
    }));
I hope this helps.
Hi, it worked for me the way you did it. Additionally, I removed the withUsername and withPassword methods from the data source configuration and passed the credentials in the JDBC URL instead; my pipeline configuration looks like this:
PCollection<KV<Double, Double>> exchangeRates = p.apply(JdbcIO.<KV<Double, Double>>read()
    .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(
        "com.mysql.jdbc.Driver",
        "jdbc:mysql://ip:3306/dbname?user=root&password=root&useUnicode=true&characterEncoding=UTF-8"))
    .withQuery("SELECT PERIOD_YEAR, PERIOD_YEAR FROM SALE")
    .withCoder(KvCoder.of(DoubleCoder.of(), DoubleCoder.of()))
    .withRowMapper(new JdbcIO.RowMapper<KV<Double, Double>>() {
        @Override
        public KV<Double, Double> mapRow(java.sql.ResultSet resultSet) throws Exception {
            LOG.info(resultSet.getDouble(1) + " Came");
            return KV.of(resultSet.getDouble(1), resultSet.getDouble(2));
        }
    }));
Hope this helps.