WHO SAID ARTIFICIAL INTELLIGENCE DOESN'T WORK, I ASKED?
Your code. Your packages. One login. Meet GitHub Package Registry.
From the announcement on LinkedIn
The other day, GitHub wrote this in their post on LinkedIn. Following the link takes you to the newly announced GitHub Package Registry, which allows developers to host package releases for distribution. It is currently in beta and supports npm, Docker images, Maven packages, NuGet, and RubyGems. The corresponding blog article has a few more insights:
With GitHub Package Registry your packages are at home with their code—sign up for the limited beta to try it out.
From the blogpost
While I appreciate the thought and the ease of integration, the announcement doesn't leave me with a cosy feeling. It's a bit as if GitHub is trying to become the Facebook of code. The Internet is designed to work in a decentralised fashion, and the interesting part has always been the freedom of choice. When functionality merges into a single platform, that choice gets lost and there is opportunity for misuse.
In particular, it seems almost forgotten that GitHub, just like LinkedIn, has been acquired by Microsoft (in 2018 and 2016, respectively). This perspective casts a different light on the added functionality, and developers may want to evaluate the remaining alternatives.
Check out a cool project that leverages Stack Overflow data and Google's Cloud AI to predict which tags would work best on Stack Overflow questions.
Celery is a distributed task queue for Python. While the emphasis of the software is on distribution, the concept of workers allows for settings that span more than an individual task. The first rule of optimisation may be "don't", but sharing database connections across tasks is low-hanging fruit in most cases, and it can be configured per worker with the signals Celery provides. To create a database connection for each worker instance, use these signals to open the connection when the worker starts.
This can be achieved with the worker_process_init signal, plus the corresponding worker_process_shutdown signal to clean up when the worker shuts down.
The code should obviously be picked up at worker start, so the tasks.py file is a good place to keep these settings.
import logging
import sqlite3

from celery import Celery
from celery.signals import worker_process_init
from celery.signals import worker_process_shutdown

log = logging.getLogger(__name__)

app = Celery('tasks', broker=CELERY_BROKER_URL)
db = None

@worker_process_init.connect
def init_worker(**kwargs):
    global db
    log.debug('Initializing database connection for worker.')
    db = sqlite3.connect("urls.sqlite")

@worker_process_shutdown.connect
def shutdown_worker(**kwargs):
    global db
    if db:
        log.debug('Closing database connection for worker.')
        db.close()
The example above opens a connection to an SQLite database, which has its own concurrency issues, but it is only meant as an illustration. The connection is established once per worker at startup.
Box-256 is a browser game where you solve small tasks, e.g. making a program draw a square, in your browser. By writing assembly. Since I wrote quite a bit of assembly throughout my career, I thought this would be interesting. Still, I failed at level one. Mostly because of impatience.
For all data that is too big for Excel and too small for Hadoop. Or to automate analyses, of course.