I need to dump specific tables from my database (3 tables, to be exact, out of 200) so that only those tables are turned into DBIx::Class::Schema classes.
Here is the command from the docs (https://metacpan.org/pod/dbicdump):
dbicdump -o dump_directory=./lib -o components='["InflateColumn::DateTime"]' -o preserve_case=1 MyApp::Schema dbi:mysql:database=database_name user pass;
I tried appending the table name after database_name, but no luck: it still dumps all the tables in the specified database. Need help; I can't find anything about this in the docs.
Also, an off-topic question: what do these mean?
-o components='["InflateColumn::DateTime"]' -o preserve_case=1
I also can't find them explained in the docs.
Thanks
You can pass the constraint option to the underlying DBIx::Class::Schema::Loader instance so that it only selects certain tables. The documentation is a bit vague on this:
These can be specified either as a regex (preferably on the qr// form), or as an arrayref of arrayrefs. Regexes are matched against the (unqualified) table name, while arrayrefs are matched according to "moniker_parts".
For example:
db_schema => [qw(some_schema other_schema)],
moniker_parts => [qw(schema name)],
constraint => [
    [ qr/\Asome_schema\z/  => qr/\A(?:foo|bar)\z/ ],
    [ qr/\Aother_schema\z/ => qr/\Abaz\z/ ],
],
In this case only the tables foo and bar in some_schema and baz in other_schema will be dumped.
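As an aside, if you drive the loader from Perl rather than via dbicdump, the same loader options can be passed directly to make_schema_at from DBIx::Class::Schema::Loader. A minimal sketch, reusing the placeholder schema and table names from the docs example above:

use DBIx::Class::Schema::Loader qw(make_schema_at);

make_schema_at(
    'MyApp::Schema',
    {
        dump_directory => './lib',
        components     => [ 'InflateColumn::DateTime' ],
        preserve_case  => 1,
        db_schema      => [qw(some_schema other_schema)],
        moniker_parts  => [qw(schema name)],
        # Only dump foo/bar from some_schema and baz from other_schema.
        constraint     => [
            [ qr/\Asome_schema\z/  => qr/\A(?:foo|bar)\z/ ],
            [ qr/\Aother_schema\z/ => qr/\Abaz\z/ ],
        ],
    },
    [ 'dbi:mysql:database=database_name', 'user', 'pass' ],
);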
So what you need to pass to dbicdump would look something like this:
dbicdump \
-o dump_directory=./lib \
-o components='["InflateColumn::DateTime"]' \
-o preserve_case=1 \
-o constraint='qr/^(?:foo|bar|baz)$/' \
MyApp::Schema dbi:mysql:database=database_name user pass;
That will give you only the tables foo, bar and baz. A single quoted regular expression, without arrayrefs, is all you need when there is only one schema and you don't want to manually preset monikers (the names used for the tables' classes in the generated Schema).
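As for the off-topic question: -o components='["InflateColumn::DateTime"]' makes the loader add that component to every generated result class, so DATETIME/TIMESTAMP columns are inflated to DateTime objects when you read them, and -o preserve_case=1 keeps table and column names exactly as they are in the database instead of lowercasing them. Both are loader options, so they are documented in DBIx::Class::Schema::Loader::Base rather than in dbicdump itself (the -o values are handed through to the loader). A hypothetical generated result class (table and column names invented for illustration) would contain roughly:

package MyApp::Schema::Result::Foo;
use strict;
use warnings;
use base 'DBIx::Class::Core';

# Added because of -o components='["InflateColumn::DateTime"]':
__PACKAGE__->load_components('InflateColumn::DateTime');

__PACKAGE__->table('foo');

# With -o preserve_case=1 a mixed-case column keeps its casing;
# without it, "CreatedAt" would be lowercased to "createdat".
__PACKAGE__->add_columns(
    'CreatedAt' => { data_type => 'datetime' },
);

1;

With the component loaded, $row->CreatedAt returns a DateTime object instead of the raw string from the database.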