I'm currently writing a data collection service that talks to several other services. There are about 5 different API endpoints with differing hosts and port numbers. I wanted to create a separate settings file for these, but then figured the .ini file would be a better place, or so I thought...
My development.ini looks something like this:
[app:main]
use = egg:awesomeproject
auth.tkt = 'abc'
auth.secret = 'I love python'
mongodb.host = 'somehost'
mongodb.port = 6379
[server:main]
use = egg:waitress#main
host = 0.0.0.0
port = 6543
[user:sg:qa]
host = 127.0.0.1
port = 1234
[customer:sg:qa]
host = 127.0.0.2
port = 4567
I tried to access the custom sections from within a Pyramid event subscriber, like so:
def add_api_path(event):
    request = event.request
    settings = request.registry.settings
    _type = 'customer:sg:qa'
    base_config = settings[_type]
But that didn't work, because settings is actually a dict of only the [app:main]
attributes. Can someone show me how to access the other sections the Pyramid way? I've read about doing it with ConfigParser, but I wanted to ask first whether Pyramid offers an easier way.
The section isolation you're seeing is intentional: Pyramid only hands your app the [app:main] section. If you want the other sections, you'll have to parse the config file yourself, for example in your main function:
from configparser import ConfigParser
from pyramid.config import Configurator

def main(global_conf, **settings):
    # global_conf contains 'here' (the ini file's directory) and '__file__'
    parser = ConfigParser({'here': global_conf['here']})
    parser.read(global_conf['__file__'])
    # Flatten the extra section into the main settings dict under prefixed keys
    for k, v in parser.items('user:sg:qa'):
        settings['user:sg:qa:' + k] = v
    config = Configurator(settings=settings)
Then later you can grab the settings:
request.registry.settings['user:sg:qa:host']
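For instance, the add_api_path subscriber from your question could then rebuild a section dict from the prefixed keys. A minimal sketch, assuming the 'customer:sg:qa' section was flattened the same way as above:

def add_api_path(event):
    settings = event.request.registry.settings
    prefix = 'customer:sg:qa:'
    # Collect the flattened keys back into a plain dict
    base_config = {k[len(prefix):]: v
                   for k, v in settings.items()
                   if k.startswith(prefix)}
    # base_config is now e.g. {'host': '127.0.0.2', 'port': '4567'}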
Update
In Pyramid 1.9 the ini parsing was made pluggable, and a new library, plaster, was created to assist in loading arbitrary sections of the file in a standard way. Below is the updated example:
import plaster
from pyramid.config import Configurator

def main(global_conf, **settings):
    # Load an arbitrary section of the ini file via plaster
    user_settings = plaster.get_settings(global_conf['__file__'], 'user:sg:qa')
    for k, v in user_settings.items():
        settings['user:sg:qa:' + k] = v
    config = Configurator(settings=settings)
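As a variation (my own suggestion, not part of the original answer), you could keep the section together as a nested dict instead of flattening the keys:

def main(global_conf, **settings):
    # Store the whole section as one nested dict instead of prefixed keys
    settings['user:sg:qa'] = plaster.get_settings(global_conf['__file__'],
                                                  'user:sg:qa')
    config = Configurator(settings=settings)

Then you'd read it back later with request.registry.settings['user:sg:qa']['host'].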