This example illustrates various parameters that can be adjusted when using the on-premise device detection engine. These parameters control when a new data file is sought and when it is loaded by the device detection software. Three main aspects are demonstrated:
- Update on Start-Up
- Filesystem Watcher
- Daily auto-update
License Key
In order to test this example you will need a 51Degrees Enterprise license, which can be purchased from our pricing page. Look for our "Bigger" or "Biggest" options.
Data Files
You can find out more about data files, licenses, etc. at our FAQ page.
Enterprise Data File
Enterprise (fully-featured) data files are typically released by 51Degrees four days a week (Mon-Thu), and on-premise deployments can fetch and download those files automatically. Alternatively, customers may choose to download the files themselves and move them into place, where they will be detected by the 51Degrees filesystem watcher.
Manual Download
If you prefer to download files yourself, you may do so here:
```
https://distributor.51degrees.com/api/v2/download?LicenseKeys=<your_license_key>&Type=27&Download=True&Product=22
```
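If you want to script the manual download, the URL can be assembled from its query parameters. The helper below is a sketch (the function name and the target filename are illustrative, not part of the 51Degrees API); `Type=27` and `Product=22` are taken from the URL above.

```python
from urllib.parse import urlencode
from urllib.request import urlretrieve

DISTRIBUTOR_URL = "https://distributor.51degrees.com/api/v2/download"

def build_download_url(license_key):
    # Type=27 selects the Hash data format; Product=22 the Enterprise product,
    # matching the manual-download URL shown above.
    params = urlencode({
        "LicenseKeys": license_key,
        "Type": "27",
        "Download": "True",
        "Product": "22",
    })
    return f"{DISTRIBUTOR_URL}?{params}"

# To actually fetch the file (requires a valid license key):
# urlretrieve(build_download_url("your_license_key"), "Enterprise.hash")
```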
Lite Data File
Lite data files (free-to-use, limited capabilities, no license key required) are created roughly once a month and cannot be updated using auto-update. They may be downloaded from GitHub and are included with source distributions of this software.
Update on Start-Up
You can configure the pipeline builder to download an Enterprise data file on start-up.
Pre-Requisites
- a license key
- a file location for the download
  - this may be an existing file, which will be overwritten
  - or, if it does not exist, the path must end in ".hash" and its directory must already exist
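The rules above can be checked before building the pipeline. This is a minimal sketch of such a check (the `validate_download_path` helper is hypothetical, not part of the 51Degrees API):

```python
import os

def validate_download_path(path):
    # An existing file is acceptable: it will be overwritten by the download.
    if os.path.isfile(path):
        return True
    # Otherwise the name must end in ".hash" and its directory must exist.
    directory = os.path.dirname(path) or "."
    return path.endswith(".hash") and os.path.isdir(directory)
```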
Configuration
- the pipeline must be configured to use a temp file
```python
create_temp_copy = True,
```
- a DataFileUpdateService must be supplied
```python
update_event = UpdateEvent()
update_service = DataFileUpdateService()
update_service.on_complete(
    lambda status, file: update_event.set(status))

data_file_update_service = update_service,
```
- update on start-up must be specified, which will cause pipeline creation to block until a file is downloaded
```python
update_on_start = True,
```
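The `UpdateEvent` used in the configuration snippet is a small helper defined by the example itself, not a library class: a `threading.Event` that records the status passed to the update service's completion callback. A minimal sketch of it:

```python
import threading

class UpdateEvent(threading.Event):
    """A threading.Event that also records the update status it was set with."""

    def __init__(self):
        super().__init__()
        self.status = None

    def set(self, status):
        # Remember the status reported by the update service, then wake waiters.
        self.status = status
        super().set()
```

The example then blocks on `update_event.wait(timeout)` and inspects `update_event.status` to decide how to proceed.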
File System Watcher
You can configure the pipeline builder to watch for changes to the currently loaded device detection data file, and to replace the file currently in use with the new one. This is useful, for example, if you wish to download and update the device detection file "manually" - i.e. you would download it then drop it into place with the same path as the currently loaded file. That location is checked periodically (by default every 30 minutes), and this frequency can be configured.
Pre-Requisites
- a license key
- the file location of the existing file
Configuration
- the pipeline must be configured to use a temp file
```python
create_temp_copy = True,
```
- a DataFileUpdateService must be supplied
```python
update_event = UpdateEvent()
update_service = DataFileUpdateService()
update_service.on_complete(
    lambda status, file: update_event.set(status))

data_file_update_service = update_service,
```
- configure the frequency with which the location is checked, in seconds (10 mins as shown)
```python
polling_interval = (10*60),
```
Daily auto-update
Enterprise data files are usually created four times a week. Each data file contains the date on which the next data file is expected. You can configure the pipeline to start looking for a newer data file after that time, by connecting to the 51Degrees distributor to see if an update is available. If one is, it is downloaded and replaces the device detection file currently in use.
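Conceptually, the decision the engine makes is simple: begin polling the distributor only once the "next update expected" date in the current data file has passed. A sketch of that gate (the helper is hypothetical, for illustration only):

```python
from datetime import datetime

def should_poll_for_update(next_update_expected, now=None):
    # Start checking the distributor only after the data file's own
    # "next update expected" timestamp has passed.
    now = now or datetime.now()
    return now >= next_update_expected
```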
Pre-Requisites
- a license key
- the file location of the existing file
Configuration
- the pipeline must be configured to use a temp file
```python
create_temp_copy = True,
```
- a DataFileUpdateService must be supplied
```python
update_event = UpdateEvent()
update_service = DataFileUpdateService()
update_service.on_complete(
    lambda status, file: update_event.set(status))

data_file_update_service = update_service,
```
- Set the frequency in seconds that the pipeline should check for updates to data files. A recommended polling interval in a production environment is around 30 minutes.
```python
polling_interval = (10*60),
```
- Set the maximum amount of time in seconds that should be added to the polling interval. This is useful in datacenter applications where multiple instances may be polling for updates at the same time. A recommended amount in production environments is 600 seconds.
```python
update_time_maximum_randomisation = (10*60),
```
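The two settings combine so that each instance waits the polling interval plus a random offset before its next check, which spreads out requests from instances started at the same time. A sketch of the resulting delay (illustrative only; the engine handles this internally):

```python
import random

def next_check_delay(polling_interval, max_randomisation):
    # Delay before the next update check: the fixed interval plus a random
    # offset in [0, max_randomisation] seconds, so that many instances
    # started together do not all hit the distributor at once.
    return polling_interval + random.uniform(0, max_randomisation)
```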
Location
This example is available in full on GitHub.
This example requires a subscription to 51Degrees Device Data, which can be acquired from the 51Degrees pricing page.
Required PyPI dependencies:
```python
# Excerpt from the example. Helper definitions and imports such as
# ExampleUtils, KeyUtils, DeviceDetectionPipelineBuilder and
# ENTERPRISE_DATAFILE_NAME are elided from this listing.
import os
import shutil
import sys
import threading
from datetime import datetime

from fiftyone_pipeline_core.logger import Logger
from fiftyone_pipeline_engines.datafile_update_service import DataFileUpdateService
from fiftyone_pipeline_engines.datafile_update_service import UpdateStatus

UPDATE_EXAMPLE_LICENSE_KEY_NAME = "license_key"
DEFAULT_DATA_FILENAME = os.path.expanduser("~") + os.path.sep + ENTERPRISE_DATAFILE_NAME

class UpdateEvent(threading.Event):
    # A threading.Event that records the status it was set with.
    def __init__(self):
        super().__init__()
        self.status = None

    def set(self, status):
        self.status = status
        super().set()

class DataFileUpdateConsole():
    def run(self, data_file, license_key, interactive, logger, output):
        logger.log("info", "Starting example")

        # Try to obtain a license key if one was not supplied.
        if license_key == None:
            license_key = KeyUtils.get_named_key(UPDATE_EXAMPLE_LICENSE_KEY_NAME)
        if license_key == None or KeyUtils.is_invalid_key(license_key):
            output(
                "In order to test this example you will need a 51Degrees Enterprise "
                "license which can be obtained on a trial basis or purchased from our\n"
                "pricing page http://51degrees.com/pricing. You must supply the license "
                "key as an argument to this program, or as an environment or system variable "
                f"named '{UPDATE_EXAMPLE_LICENSE_KEY_NAME}'")
            raise Exception("No license key available")

        # Work out where the data file is, or where it should be downloaded to.
        if data_file != None:
            data_file = ExampleUtils.find_file(data_file)
            if os.path.exists(os.path.dirname(data_file)) == False:
                output(
                    "The directory must exist when specifying a location for a new "
                    f"file to be downloaded. Path specified was '{data_file}'")
                raise Exception("Directory for new file must exist")
            logger.log("warning",
                f"File {data_file} not found, a file will be downloaded to that location on "
                "start-up")
        if data_file == None:
            data_file = os.path.realpath(DEFAULT_DATA_FILENAME)
            logger.log("warning",
                f"No filename specified. Using default '{data_file}' which will be downloaded to "
                "that location on start-up, if it does not exist already")

        copy_data_file_name = data_file + ".bak"
        if os.path.exists(data_file):
            # Build a throwaway pipeline to check the existing file's data tier
            # before overwriting it.
            pipeline = DeviceDetectionPipelineBuilder(
                data_file_path = data_file,
                performance_profile = "LowMemory",
                usage_sharing = False,
                licence_keys = "").add_logger(logger).build()
            ExampleUtils.check_data_file(pipeline, logger)
            if ExampleUtils.get_data_file_tier(pipeline.get_element("device")) == "Lite":
                output(
                    "Will not download an 'Enterprise' data file over the top of "
                    "a 'Lite' data file, please supply another location.")
                raise Exception("File supplied has wrong data tier")
            logger.log("info", "Existing data file will be replaced with downloaded data file")
            logger.log("info", f"Existing data file will be copied to {copy_data_file_name}")

        output("Please note - this example will use available downloads "
            "in your licensed allocation.")
        user_input = input("Do you wish to continue with this example (y)? ")
        if user_input == None or user_input == "" or user_input.startswith("y") == False:
            logger.log("info", "Stopping example without download")
            return

        logger.log("info", "Checking file exists")
        if os.path.exists(data_file):
            logger.log("info", f"Existing data file copied to {copy_data_file_name}")
            shutil.copy(data_file, copy_data_file_name)

        output("Creating pipeline and initiating update on start-up - please wait for that "
            "to complete")
        update_event = UpdateEvent()
        update_service = DataFileUpdateService()
        update_service.on_complete(
            lambda status, file: update_event.set(status))

        # Build the pipeline with update on start-up and the filesystem
        # watcher enabled; this blocks until the download completes.
        pipeline = DeviceDetectionPipelineBuilder(
            data_file_path = data_file,
            create_temp_copy = True,
            data_file_update_service = update_service,
            licence_keys = license_key,
            update_on_start = True,
            file_system_watcher = True
        ).add_logger(logger).build()

        output(f"Update on start-up complete - status - {update_event.status}")

        if update_event.status == UpdateStatus.AUTO_UPDATE_SUCCESS:
            # Reset the event, then touch the file so the watcher reloads it.
            update_event.clear()
            output("Modifying downloaded file to trigger reload - please wait for that "
                "to complete")
            try:
                now = datetime.now().timestamp()
                os.utime(data_file, (now, now))
            except Exception:
                raise Exception("Could not modify file time, abandoning example")

            if update_event.wait(120):
                output(f"Update on file modification complete, status: {update_event.status}")
            else:
                output("Update on file modification timed out")
        else:
            logger.log("error", "Auto update was not successful, abandoning example")
            error_message = f"Auto update failed: {update_event.status}"
            if update_event.status == UpdateStatus.AUTO_UPDATE_ERR_429_TOO_MANY_ATTEMPTS:
                output(error_message)
            else:
                raise Exception(error_message)

        output("Finished Example")

def main(argv):
    license_key = argv[0] if len(argv) > 0 else None
    data_file = argv[1] if len(argv) > 1 else None
    logger = Logger(min_level="info")
    DataFileUpdateConsole().run(data_file, license_key, True, logger, print)

if __name__ == "__main__":
    main(sys.argv[1:])
```