This example illustrates various parameters that can be adjusted when using the on-premise device detection engine. These parameters control when a new data file is sought and when it is loaded by the device detection software. Three main aspects are demonstrated:
- Update on Start-Up
- Filesystem Watcher
- Daily auto-update
License Key
In order to test this example you will need a 51Degrees Enterprise license, which can be purchased from our pricing page. Look for our "Bigger" or "Biggest" options.
Data Files
You can find out more about data files, licenses, etc. on our FAQ page.
Enterprise Data File
Enterprise (fully-featured) data files are typically released by 51Degrees four days a week (Mon-Thu), and on-premise deployments can fetch and download those files automatically. Equally, customers may choose to download the files themselves and move them into place, where they will be picked up by the 51Degrees filesystem watcher.
Manual Download
If you prefer to download files yourself, you may do so here:
```
https://distributor.51degrees.com/api/v2/download?LicenseKeys=<your_license_key>&Type=27&Download=True&Product=22
```
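The same download URL can be assembled programmatically. The sketch below mirrors the query parameters in the link above; `build_download_url` is an illustrative helper, not part of the 51Degrees API:

```python
from urllib.parse import urlencode

DISTRIBUTOR_ENDPOINT = "https://distributor.51degrees.com/api/v2/download"

def build_download_url(license_key, data_type="27", product="22"):
    # Type=27 and Product=22 identify the Enterprise Hash data file,
    # matching the manual-download link above.
    query = urlencode({
        "LicenseKeys": license_key,
        "Type": data_type,
        "Download": "True",
        "Product": product,
    })
    return f"{DISTRIBUTOR_ENDPOINT}?{query}"
```

Note that `urlencode` percent-encodes the key, so pass the raw license key rather than a pre-encoded string.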
Lite Data File
Lite data files (free to use, limited capabilities, no license key required) are created roughly once a month and cannot be updated using auto-update. They may be downloaded from GitHub and are included with source distributions of this software.
Update on Start-Up
You can configure the pipeline builder to download an Enterprise data file on start-up.
Pre-Requisites
- a license key
- a file location for the download
- this may be an existing file, which will be overwritten
- or, if it does not exist, the name must end in ".hash" and be in an existing directory
Configuration
- the pipeline must be configured to use a temp file
```python
create_temp_copy = True,
```
- a DataFileUpdateService must be supplied
```python
update_event = UpdateEvent()
update_service = DataFileUpdateService()
update_service.on_complete(
    lambda status, file: update_event.set(status))

data_file_update_service = update_service,
```
- update on start-up must be specified, which will cause pipeline creation to block until a file is downloaded
```python
update_on_start = True,
```
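Putting these pieces together, the builder call looks roughly like this. This is a sketch only: it assumes the `fiftyone_devicedetection` package and the parameter names shown in the snippets above, and the file path and license key are placeholders:

```python
from fiftyone_devicedetection.devicedetection_pipelinebuilder import DeviceDetectionPipelineBuilder
from fiftyone_pipeline_engines.datafile_update_service import DataFileUpdateService

update_service = DataFileUpdateService()
update_service.on_complete(lambda status, file: print(f"Update finished: {status}"))

# With update_on_start=True, build() blocks until a data file
# has been downloaded from the 51Degrees distributor.
pipeline = DeviceDetectionPipelineBuilder(
    data_file_path = "51Degrees-EnterpriseV4.1.hash",  # placeholder path
    create_temp_copy = True,
    data_file_update_service = update_service,
    licence_keys = "<your_license_key>",
    update_on_start = True
).build()
```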
File System Watcher
You can configure the pipeline builder to watch for changes to the currently loaded device detection data file, and to replace the file in use with the new one. This is useful if, for example, you wish to download and update the device detection file "manually" - i.e. you download it yourself and then drop it into place at the same path as the currently loaded file. That location is checked periodically (by default every 30 minutes), and the frequency can be configured.
Pre-Requisites
- a license key
- the file location of the existing file
Configuration
- the pipeline must be configured to use a temp file
```python
create_temp_copy = True,
```
- a DataFileUpdateService must be supplied
```python
update_event = UpdateEvent()
update_service = DataFileUpdateService()
update_service.on_complete(
    lambda status, file: update_event.set(status))

data_file_update_service = update_service,
```
- configure the frequency with which the location is checked, in seconds (10 mins as shown)
```python
polling_interval = (10*60),
```
Daily auto-update
Enterprise data files are usually created four times a week. Each data file contains the date on which the next data file is expected. You can configure the pipeline so that it starts looking for a newer data file after that time, by connecting to the 51Degrees distributor to see whether an update is available. If one is, it is downloaded and replaces the device detection file currently in use.
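The date gate described above can be sketched as a simple comparison: no request is made to the distributor until the "next expected update" date embedded in the current file has passed. `should_poll` is an illustrative helper, not part of the 51Degrees API:

```python
from datetime import datetime

def should_poll(next_expected_update, now=None):
    # Only contact the distributor once the date published inside the
    # current data file has passed.
    if now is None:
        now = datetime.now()
    return now >= next_expected_update
```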
Pre-Requisites
- a license key
- the file location of the existing file
Configuration
- the pipeline must be configured to use a temp file
```python
create_temp_copy = True,
```
- a DataFileUpdateService must be supplied
```python
update_event = UpdateEvent()
update_service = DataFileUpdateService()
update_service.on_complete(
    lambda status, file: update_event.set(status))

data_file_update_service = update_service,
```
- Set the frequency in seconds that the pipeline should check for updates to data files. A recommended polling interval in a production environment is around 30 minutes.
```python
polling_interval = (10*60),
```
- Set the maximum amount of time, in seconds, that should be added to the polling interval. This is useful in datacenter applications where multiple instances may be polling for updates at the same time. A recommended amount in production environments is 600 seconds.
```python
update_time_maximum_randomisation = (10*60),
```
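The effect of the randomisation can be illustrated with the standard library: each instance waits the polling interval plus its own random offset, so a fleet of machines does not hit the distributor at exactly the same moment. This is an illustrative sketch, not the pipeline's internal code:

```python
import random

polling_interval = 10 * 60                    # base check frequency (seconds)
update_time_maximum_randomisation = 10 * 60   # maximum extra delay (seconds)

def next_check_delay():
    # Each instance adds its own random offset, spreading requests from
    # many machines across the randomisation window.
    return polling_interval + random.uniform(0, update_time_maximum_randomisation)
```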
Location
This example is available in full on GitHub.
This example requires a subscription to 51Degrees Device Data, which can be acquired from the 51Degrees pricing page.
Required PyPi Dependencies:
```python
# Abridged listing - elided lines are marked with "# ...".
from datetime import datetime
# ... (other imports elided, including os, shutil, sys, threading,
# ExampleUtils, KeyUtils, DeviceDetectionPipelineBuilder and
# ENTERPRISE_DATAFILE_NAME)
from fiftyone_pipeline_core.logger import Logger
from fiftyone_pipeline_engines.datafile_update_service import DataFileUpdateService
from fiftyone_pipeline_engines.datafile_update_service import UpdateStatus

UPDATE_EXAMPLE_LICENSE_KEY_NAME = "license_key"
DEFAULT_DATA_FILENAME = os.path.expanduser("~") + os.path.sep + ENTERPRISE_DATAFILE_NAME

class UpdateEvent(threading.Event):
    # ...
    def set(self, status):
        self.status = status
        super().set()

class DataFileUpdateConsole():
    def run(self, data_file, license_key, interactive, logger, output):
        logger.log("info", "Starting example")

        # Use a license key from the environment if none was supplied.
        if license_key == None:
            license_key = KeyUtils.get_named_key(UPDATE_EXAMPLE_LICENSE_KEY_NAME)
        if license_key == None or KeyUtils.is_invalid_key(license_key):
            logger.log("error",
                "In order to test this example you will need a 51Degrees Enterprise "
                "license which can be obtained on a trial basis or purchased from our\n"
                "pricing page http://51degrees.com/pricing. You must supply the license "
                "key as an argument to this program, or as an environment or system variable "
                f"named '{UPDATE_EXAMPLE_LICENSE_KEY_NAME}'")
            raise Exception("No license key available")

        if data_file != None:
            # ...
            data_file = ExampleUtils.find_file(data_file)
            # ...
            if os.path.exists(os.path.dirname(data_file)) == False:
                logger.log("error",
                    "The directory must exist when specifying a location for a new "
                    f"file to be downloaded. Path specified was '{data_file}'")
                raise Exception("Directory for new file must exist")
            logger.log("warning",
                f"File {data_file} not found, a file will be downloaded to that location on "
                # ...
                )
        if data_file == None:
            data_file = os.path.realpath(DEFAULT_DATA_FILENAME)
            logger.log("warning",
                f"No filename specified. Using default '{data_file}' which will be downloaded to "
                "that location on start-up, if it does not exist already")

        copy_data_file_name = data_file + ".bak"
        if os.path.exists(data_file):
            # Build a throwaway pipeline to check the tier of the existing file.
            pipeline = DeviceDetectionPipelineBuilder(
                data_file_path = data_file,
                performance_profile = "LowMemory",
                usage_sharing = False,
                # ...
                licence_keys = "").add_logger(logger).build()
            ExampleUtils.check_data_file(pipeline, logger)
            if ExampleUtils.get_data_file_tier(pipeline.get_element("device")) == "Lite":
                logger.log("error",
                    "Will not download an 'Enterprise' data file over the top of "
                    "a 'Lite' data file, please supply another location.")
                raise Exception("File supplied has wrong data tier")
            logger.log("info", "Existing data file will be replaced with downloaded data file")
            logger.log("info", f"Existing data file will be copied to {copy_data_file_name}")

        # ...
        output(
            "Please note - this example will use available downloads "
            "in your licensed allocation.")
        user_input = input("Do you wish to continue with this example (y)? ")
        if user_input == None or user_input == "" or user_input.startswith("y") == False:
            logger.log("info", "Stopping example without download")
            return  # (elided in the extract)

        logger.log("info", "Checking file exists")
        if os.path.exists(data_file):
            logger.log("info", f"Existing data file copied to {copy_data_file_name}")
            shutil.move(data_file, copy_data_file_name)

        output(
            "Creating pipeline and initiating update on start-up - please wait for that "
            # ...
            )
        update_event = UpdateEvent()
        update_service = DataFileUpdateService()
        update_service.on_complete(
            lambda status, file: update_event.set(status))

        pipeline = DeviceDetectionPipelineBuilder(
            # ...
            data_file_path = data_file,
            create_temp_copy = True,
            # ...
            data_file_update_service = update_service,
            # ...
            licence_keys = license_key,
            # ...
            update_on_start = True,
            # ...
            file_system_watcher = True
        ).add_logger(logger).build()

        # ...
        output(f"Update on start-up complete - status - {update_event.status}")

        if update_event.status == UpdateStatus.AUTO_UPDATE_SUCCESS:
            # ...
            output(
                "Modifying downloaded file to trigger reload - please wait for that"
                # ...
                )
            try:
                now = datetime.now().timestamp()
                os.utime(data_file, (now, now))
            except OSError:
                raise Exception(
                    "Could not modify file time, abandoning "
                    # ...
                    )
            if update_event.wait(120):
                output(f"Update on file modification complete, status: {update_event.status}")
            else:
                # ...
                output("Update on file modification timed out")
        else:
            logger.log("error", "Auto update was not successful, abandoning example")
            raise Exception(f"Auto update failed: {update_event.status}")

        output("Finished Example")

def main(argv):
    license_key = argv[0] if len(argv) > 0 else None
    data_file = argv[1] if len(argv) > 1 else None
    # ...
    logger = Logger(min_level="info")
    # ...
    DataFileUpdateConsole().run(data_file, license_key, True, logger, print)

if __name__ == "__main__":
    main(sys.argv[1:])
```