I have created ten polygons representing lakes on a vector layer, and I am trying to cut them from an imported and re-projected vector layer of administrative districts. When I use the 'Cut with polygon from another layer' feature, only nine lakes are removed. The last one overlaps two districts but is not cut from either of them. Grateful for any help at all.
Update. I took a closer look at my problem lake outline, and found that there were two points crossed over, effectively creating a very small loop on my geometry (QGIS highlighted this with a green cross when I zoomed in). I rearranged the two points and retried the cut using the same process and all polygons now cut correctly.
GIMP - Unwanted white borders after applying transparency layer
To which I applied a transparent background:
However, the result isn't perfect at all, as letters still have white borders. You can see this in the following zoom:
What did I do wrong? Or do I need to do something more to remove those thin white borders?
EDIT: it's not only the text that is a problem as the graph also has white borders:
I'd say there's no quick and dirty fix for choppy lines, you just gotta recreate it using vectors. The following took me 3 minutes in Photoshop with Circles and Stroke effect:
I'm not going to do it all for you, but all you need is two more half circles and you've got a shape based logo, which should scale beautifully to any size. So that's 6 circles, two semi-circles and a stroke applied to some of them. In your case it's very simple to recreate, but a detailed logo would take a lot longer.
There are various ways to do this, though in your case I'd just knuckle down and spend a little while on it. If you're proficient with any graphic design software that creates shapes, this shouldn't even take half an hour; if you aren't, now is a good time to learn a few reusable skills.
In many coastal areas, high numbers of recreationists may exceed ecological capacities. Careful monitoring of visitor flows is a first prerequisite for coastal area management. We show how AIS ship data can be translated into interpretable information on recreational boats and investigate whether AIS can provide monitoring information when compared to nature conservation policy targets. In the Wadden Sea UNESCO World Heritage Site we used nearly 9 million data points to create spatiotemporal patterns for the 2018 recreation season. We combined this with shipping lanes and bathymetry data and compared the resulting patterns with nature protection regulations. Our results show that most of the traffic is concentrated around tidal channels. We also show that exceeding speed limits is not predominant behaviour, but the effect of speeding on birds and seals might be more severe than the data suggest. We mapped favourite tidal flat mooring activities, and observed where these occur in Marine Protected Areas. We conclude that AIS analysis can provide valuable recreational boating monitoring, relevant to sensitive coastal area management in the entire Dutch Wadden Sea for the full recreational season. Broader integration of AIS with radar data and ecological data can add to the power of using AIS.
For each of the three geometry types in the test dataset we performed hypothesis testing on four metrics:
Creation time: The time it takes to create a diff given two versions of a geometry.
Apply time: The time it takes to create version n + 1 of a geometry, given version n and a diff.
Undo time: The time it takes to roll back to version n of a geometry, given version n + 1 and a diff.
Patch size: The physical size of the created diff.
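To make the timed operations concrete, here is a minimal sketch for a 2-D point. This delta encoding is an illustration only, not the GeomDiff wire format:

```python
# Hypothetical delta encoding of a 2-D point; the three functions
# correspond to the creation, apply, and undo metrics defined above.
def create_diff(old, new):
    """Creation: derive a diff from two versions of a geometry."""
    return (new[0] - old[0], new[1] - old[1])

def apply_diff(old, diff):
    """Apply: produce version n + 1 from version n and a diff."""
    return (old[0] + diff[0], old[1] + diff[1])

def undo_diff(new, diff):
    """Undo: roll back to version n from version n + 1 and a diff."""
    return (new[0] - diff[0], new[1] - diff[1])

v0, v1 = (10.0, 5.0), (10.5, 4.0)
d = create_diff(v0, v1)
assert apply_diff(v0, d) == v1 and undo_diff(v1, d) == v0
```

The patch size metric would then be the serialized size of `d`.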
We expect the GeomDiff algorithm to exhibit faster creation, apply, and undo times for point, linestring, and polygon geometries than the other algorithms. In addition, we expect the GeomDiff algorithm to produce smaller patches.
The statistical testing was performed using the following procedure, implemented as a Python script. All statistical tests were performed using a significance level of 0.05. For each metric of each geometry type, the recorded data for each of the four algorithms was loaded. First, all errors were counted, recorded (see Table 6), and then removed before further analysis. An error is either an exception thrown by the code or an instance where the patch did not create the expected result.
Second, a D’Agostino and Pearson’s test was applied to check each group for normal distribution. Since none of the groups were normally distributed (p < 0.05), a Kruskal-Wallis H-test was then applied to test H0, that the samples from all algorithms came from the same distribution. Since H0 was rejected in all cases (p < 0.05), we continued with a post hoc test to perform pairwise comparisons between the four algorithms. Using Conover’s test, we found that none of the pairs were statistically similar (p < 0.05). This means that all differences between the mean values for each algorithm are significant.
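Assuming the script used SciPy, the first two steps can be sketched as follows. The timing samples here are synthetic stand-ins, and Conover's post hoc test is not part of SciPy itself (it is typically taken from the scikit-posthocs package):

```python
import numpy as np
from scipy import stats

ALPHA = 0.05
rng = np.random.default_rng(42)

# Synthetic stand-ins for recorded timing samples (ms) of two algorithms.
samples = {
    "GeomDiff": rng.lognormal(mean=0.0, sigma=1.0, size=500),
    "JsonDiff": rng.lognormal(mean=0.5, sigma=1.0, size=500),
}

# Step 1: D'Agostino-Pearson normality test per group.
normal = {name: stats.normaltest(s).pvalue >= ALPHA for name, s in samples.items()}

# Step 2: Kruskal-Wallis H-test across the groups.
h_stat, p_value = stats.kruskal(*samples.values())
reject_h0 = p_value < ALPHA

# Step 3 (pairwise Conover's test) would follow when H0 is rejected,
# e.g. via scikit_posthocs.posthoc_conover(); omitted here.
```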
For point geometries (Table 3), a total of 1,335,489 geometry pairs were checked for each algorithm. Overall, the BinaryDiff algorithm is slower than the fastest algorithm by a factor of 1000 on create and apply. The TextDiff and JsonDiff algorithms show comparable results, apart from patch size. The GeomDiff algorithm produces the smallest patch in the shortest time and is also the fastest to apply and undo.
For linestring geometries (Table 4), a total of 813,503 geometry pairs were checked for each algorithm. The mean number of vertices is 24, the 99th percentile 236. When it comes to performance, the GeomDiff algorithm is considerably slower to create patches, albeit with a large standard deviation, but it is still the fastest on apply and undo time. The JsonDiff algorithm is the fastest to create patches, but the patches created by the JsonDiff algorithm are on average larger than patches created by the BinaryDiff algorithm by a factor of 8.5.
For polygon geometries (Table 5), a total of 433,776 polygon pairs with a mean vertex count of 28 (99th percentile 299) were checked. In terms of performance, the polygon dataset exhibits much the same trends as the linestring data. The standard deviations are large, and the BinaryDiff and GeomDiff algorithms are considerably slower than TextDiff and JsonDiff when it comes to create time, but at the same time they produce the smallest patches.
The error counts (Table 6) show that the GeomDiff algorithm encountered 22 and 34 create errors, and 33 and 45 patch and undo errors on linestrings and polygons, respectively. The TextDiff algorithm failed to undo 38,480 linestring pairs (5%) and 18,396 (4%) polygon pairs correctly.
For point geometries the rates are close to zero (< 1 ‰) for all metrics.
The create errors for the TextDiff algorithm are all “Invalid URI: The Uri string is too long.” This error originates in the Diff Match Patch library, which uses the URL encoding provided by the C# standard library. This shows that the limiting factor for string length, and by extension vertex count, is the URL encoding method.
For the GeomDiff algorithm, all create errors are “Timed out after 60000 ms”. This is a hard limit built into the GeomDiff library to prevent long-running operations from blocking for an unreasonable amount of time.
Vertex number effects
For linestring and polygon geometries, the GeomDiff algorithm exhibits an unusually large standard deviation on the Create Time metric. In order to investigate possible causes, we identified the 99th percentile and removed observations above it. This is shown in Table 7. We see that by removing 1% of the observations the standard deviation is reduced by two orders of magnitude.
One possible explanation for this is that the create time for the GeomDiff algorithm increases as the number of geometry vertices increases. This explanation is supported by the create failures on 22 linestring and 34 polygon geometries. In these cases, the algorithm ran for 60 s before timing out. Examining the geometries which caused the errors, we find an average vertex count of 1677 and 1576 for linestrings and polygons, respectively. For the top 1 (slowest) percentile, the vertex count averages were 300 and 364. These numbers are both a substantial increase from the full population, which on average has a vertex count of 24 for linestrings and 28 for polygons. In other words, large vertex counts seem to indicate long running times.
To further investigate whether the vertex count variable influences create time, we calculated the Pearson correlation coefficient  between creation time and vertex count, as shown in Table 8. We see that the correlation change between the whole population and the top 1 percentile is substantial for the GeomDiff algorithm (+ 0.17 / + 0.81), while it is relatively stable or decreasing for the other algorithms (- 0.02 / - 0.01 for the TextDiff algorithm). Thus, we suspect that the vertex count in linestring and polygon geometries affects the creation time for the GeomDiff algorithm significantly, and especially for large numbers of vertices.
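The trimming and correlation computation can be sketched with synthetic data. The quadratic cost model below is an assumption made only to produce illustrative numbers, not the measured behaviour:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: create time grows super-linearly with vertex count.
vertices = rng.integers(4, 2000, size=10_000)
create_ms = 0.001 * vertices**2 + rng.normal(0, 5, size=vertices.size)

# Trim the observations above the 99th percentile, as in Table 7.
cutoff = np.percentile(create_ms, 99)
mask = create_ms <= cutoff

# Pearson correlation between vertex count and create time,
# for the full population and the trimmed population (cf. Table 8).
r_full = np.corrcoef(vertices, create_ms)[0, 1]
r_trim = np.corrcoef(vertices[mask], create_ms[mask])[0, 1]
```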
By grouping the create time results by vertex count and computing average creation time for each group (Fig. 2 and Fig. 3), we find that all algorithms except the BinaryDiff algorithm show an increase in creation time with increasing number of vertices. However, for the GeomDiff algorithm, there is a sharp increase when exceeding a vertex count of 500, for both linestrings and polygons.
Average create time for linestring patches, grouped by vertex count
Average create time for polygon patches, grouped by vertex count
Speaking in engineering drawing terms, there are two methods to generate projections of an object: first-angle projection and third-angle projection. These projections are developed based on how the object is conceptually viewed in a quadrant system.
In First Angle Projection we place our object in the First Quadrant (see above figure). This means that the Vertical Plane is behind the object and the Horizontal Plane is underneath the object.
In Third Angle Projection the Object is placed in the Third Quadrant. This means that the Vertical Plane is in front of the object and the Horizontal Plane is above the object.
These changes in the position of the views are the only difference between projection methods.
So basically, visualize an object being viewed from these different planes. The frontal (vertical) plane gives you the front view; the horizontal plane gives you the top or bottom view (depending on the angle of projection).
And visualization is the key to becoming good in CAD drawings, 3D modelling etc. It takes a bit of practice and imagination but it's easy to get the hang of :)
The authors wish to acknowledge funding by the Generalitat de Catalunya (Grup Consolidat de Recerca: Grup d’Hidrologia Subterrània, 2014-SGR-1377). Mar Alcaraz was funded by a postdoctoral fellowship from the Argentinian National Scientific and Technical Research Council (5043-15/12/2015). Rotman Criollo also acknowledges the support of the Catalan Industrial Doctorates Plan of the Secretariat for Universities and Research, Ministry of Economy and Knowledge of the Generalitat de Catalunya. We acknowledge MIKE by DHI for the sponsored FEFLOW license.
The netCDF vector driver supports reading and writing netCDF files following the Climate and Forecast (CF) Metadata Conventions. Vector datasets can be written using the simple geometry specification of the CF-1.8 convention, or by using the CF-1.6 convention and by writing non-point geometry items as WKT.
Distinguishing the Two Formats
Upon reading a netCDF file, the driver will attempt to read the global Conventions attribute. If its value is CF-1.8 or higher (in this exact format, as specified in the CF convention), then the driver will treat the netCDF file as one that has CF-1.8 geometries contained within it. If the Conventions attribute has a value of CF-1.6, then the file will be treated as following the CF-1.6 convention.
CF-1.8 Writing Limitations
Writing to a CF-1.8 netCDF dataset poses some limitations. Only the feature types specified by the CF-1.8 standard (see section Geometry for more details) are supported for writing, and measured features are only partially supported. Other geometries, such as non-simple curve geometries, are not supported at all.
CF-1.8 datasets also do not support the append access mode.
CF-1.8 datasets reserve certain variable names, which the driver uses to store its metadata. Refrain from using these names as layer names to avoid naming conflicts when writing datasets with multiple layers.
Suppose a layer in a CF-1.8 dataset has the name LAYER with a field with name FIELD. Then the following names would be considered reserved:
LAYER_node_coordinates: used to store point information
LAYER_node_count: used to store per shape point count information (not created if LAYER has a geometry type of Point)
LAYER_part_node_count: used to store per part point count information (only created if LAYER consists of MultiLineStrings, MultiPolygons, or has at least one Polygon with interior rings)
LAYER_interior_ring: used to store interior ring information (only created if LAYER consists of at least one Polygon with interior rings)
LAYER_field_FIELD: used to store field information for FIELD.
These names are the only reserved names applying to CF-1.8 datasets.
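For illustration, the reserved names for a given layer can be generated mechanically. This is a sketch; "lakes" and "depth" are hypothetical layer and field names:

```python
def reserved_names(layer, fields):
    """Enumerate the variable names reserved for a CF-1.8 layer."""
    names = [
        f"{layer}_node_coordinates",
        f"{layer}_node_count",
        f"{layer}_part_node_count",
        f"{layer}_interior_ring",
    ]
    names += [f"{layer}_field_{field}" for field in fields]
    return names

# Avoid using any of these as layer names in a multi-layer dataset:
print(reserved_names("lakes", ["depth"]))
```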
CF-1.6/WKT datasets are not subject to the aforementioned restrictions.
A divide-and-conquer method for space–time series prediction
Space–time series can be partitioned into space–time smooth and space–time rough, which represent different scale characteristics. However, most existing methods for space–time series prediction address the space–time series as a whole and do not consider the interaction between space–time smooth and space–time rough in the process of prediction. This may affect the accuracy of space–time series prediction, because the interaction between these two components (i.e., space–time smooth and space–time rough) may cause one of them to become the dominant component, thus weakening the behavior of the other. Therefore, a divide-and-conquer method for space–time prediction is proposed in this paper. First, the observational fine-grained data are decomposed into two components: coarse-grained data and the residual terms of fine-grained data. These two components are then modeled separately. Finally, the predicted values of the fine-grained data are obtained by integrating the predicted values of the coarse-grained data with the residual terms. The experimental results on two groups of different space–time series demonstrate the effectiveness of the divide-and-conquer method.
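A minimal sketch of the decomposition idea, assuming a moving-average smoother and a persistence forecast (both are illustrative stand-ins, not the paper's models):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic "fine-grained" series: a smooth signal plus noise.
series = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.2 * rng.normal(size=200)

def moving_average(x, w=5):
    # Coarse-graining step: a simple smoother stands in for the paper's one.
    return np.convolve(x, np.ones(w) / w, mode="same")

smooth = moving_average(series)   # space-time smooth (coarse-grained) part
rough = series - smooth           # space-time rough (residual) part

# The decomposition is exact: the components recombine to the original.
assert np.allclose(smooth + rough, series)

# Model each component separately (here: naive persistence), then integrate.
pred_next = smooth[-1] + rough[-1]
```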
The required capabilities specified in this clause serve as the base for options specified in clause Options and extensions specified in clause Registered Extensions (Normative). All gpkg_* tables and views and all tiles user data tables specified in this standard SHALL have only the specified columns and table constraints. Any features user data tables MAY have columns in addition to those specified. All specified table, view, column, trigger, and constraint name values SHALL be lowercase.
The mandatory core capabilities defined in sub clauses and requirement statements of this clause SHALL be implemented by every GeoPackage and GeoPackage SQLite Configuration.
1.1.1. SQLite Container
The SQLite software library provides a self-contained, single-file, cross-platform, serverless, transactional, open source RDBMS container. The GeoPackage standard defines a SQL database schema designed for use with the SQLite software library. Using SQLite as the basis for GeoPackage simplifies production, distribution and use of GeoPackages and assists in guaranteeing the integrity of the data they contain.
"Self-contained" means that container software requires very minimal support from external libraries or from the operating system. "Single-file" means that a container not currently opened by any software application consists of a single file in a file system supported by a computing platform operating system. "Cross-platform" means that a container file MAY be created and loaded with data on one computing platform, and used and updated on another, even if they use different operating systems, file systems, and byte order (endian) conventions. "Serverless" means that the RDBMS container is implemented without any intermediary server process, and accessed directly by application software. "Transactional" means that RDBMS transactions guarantee that all changes to data in the container are Atomic, Consistent, Isolated, and Durable (ACID) despite program crashes, operating system crashes, and power failures.
1.1.1.1.1. File Format
A GeoPackage SHALL be a SQLite database file using version 3 of the SQLite file format. The first 16 bytes of a GeoPackage SHALL be the null-terminated ASCII [B4] string "SQLite format 3".
A GeoPackage SHALL contain a value of 0x47504B47 ("GPKG" in ASCII) in the "application_id" field of the SQLite database header to indicate that it is a GeoPackage. A GeoPackage SHALL contain an appropriate value in the "user_version" field of the SQLite database header to indicate its version. The value SHALL be an integer with a major version, two-digit minor version, and two-digit bug-fix. For GeoPackage Version 1.2 this value is 0x000027D8 (the hexadecimal value for 10200).
The maximum size of a GeoPackage file is about 140TB. In practice a lower size limit MAY be imposed by the filesystem to which the file is written. Many mobile devices require external memory cards to be formatted using the FAT32 file system which imposes a maximum size limit of 4GB.
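The header fields above can be checked with a few lines of Python's standard library. This is a sketch; the file created here is a plain SQLite database with the GeoPackage header values set, not a full GeoPackage:

```python
import os
import sqlite3
import struct
import tempfile

GPKG_APP_ID = 0x47504B47  # "GPKG" in ASCII, per the text above

path = os.path.join(tempfile.mkdtemp(), "example.gpkg")
con = sqlite3.connect(path)
con.execute(f"PRAGMA application_id = {GPKG_APP_ID}")
con.execute("PRAGMA user_version = 10200")   # GeoPackage version 1.2
con.execute("CREATE TABLE t (id INTEGER)")   # force the file to be written
con.commit()
con.close()

with open(path, "rb") as f:
    header = f.read(100)

# Magic string occupies the first 16 bytes, including the trailing NUL.
assert header[:16] == b"SQLite format 3\x00"
# Per the SQLite file format: user_version at offset 60, application_id at 68,
# both big-endian 32-bit integers.
version, = struct.unpack(">I", header[60:64])
app_id, = struct.unpack(">I", header[68:72])
assert app_id == GPKG_APP_ID and version == 10200
```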
1.1.1.1.2. File Extension Name
A GeoPackage SHALL have the file extension name ".gpkg".
It is RECOMMENDED that Extended GeoPackages use the file extension ".gpkx", but this is NOT a GeoPackage requirement.
1.1.1.1.3. File Contents
A GeoPackage SHALL only contain the data elements (tables, columns, or values) and SQL constructs (views, constraints, or triggers) specified in the core of this encoding standard (Features, Tiles, and Attributes). Extended GeoPackages MAY contain additional data elements and SQL constructs as specified through the Extension Mechanism.
The GeoPackage designation is designed to provide maximum interoperability between applications. In an Extended GeoPackage, the extension mechanism is used to provide additional capabilities in a way that maintains interoperability as much as possible. Developers are encouraged to consider the implications of extensions when designing their applications. Best practices include the following:
Designing in a way that anticipates the presence of unexpected extensions, e.g., gracefully handling unexpected columns, values, or encodings.
Using the RTree Spatial Indexes extension for GeoPackages containing a non-trivial amount of vector data.
Using the WKT for Coordinate Reference Systems extension, which is strongly recommended due to inherent weaknesses in the original standard for encoding coordinate reference systems.
The columns of tables in a GeoPackage SHALL only be declared using one of the data types specified in table GeoPackage Data Types. Extended GeoPackages MAY contain additional data types as specified through the Extension Mechanism.
BOOLEAN: A boolean value representing true or false. Stored as SQLite INTEGER with value 0 for false or 1 for true.
TINYINT: 8-bit signed two’s complement integer. Stored as SQLite INTEGER with values in the range [-128, 127].
SMALLINT: 16-bit signed two’s complement integer. Stored as SQLite INTEGER with values in the range [-32768, 32767].
MEDIUMINT: 32-bit signed two’s complement integer. Stored as SQLite INTEGER with values in the range [-2147483648, 2147483647].
INT, INTEGER: 64-bit signed two’s complement integer. Stored as SQLite INTEGER with values in the range [-9223372036854775808, 9223372036854775807].
FLOAT: 32-bit IEEE floating point number. Stored as SQLite REAL limited to values that can be represented as a 4-byte IEEE floating point number.
DOUBLE, REAL: 64-bit IEEE floating point number. Stored as SQLite REAL.
TEXT{(maxchar_count)}: Variable length string encoded in either UTF-8 or UTF-16, determined by PRAGMA encoding (see http://www.sqlite.org/pragma.html#pragma_encoding). The optional maxchar_count defines the maximum number of characters in the string. If not specified, the length is unbounded. The count is provided for informational purposes, and applications MAY choose to truncate longer strings if encountered. When present, it is best practice for applications to adhere to the character count. Stored as SQLite TEXT.
BLOB{(max_size)}: Variable length binary data. The optional max_size defines the maximum number of bytes in the blob. If not specified, the length is unbounded. The size is provided for informational purposes. When present, it is best practice for applications to adhere to the maximum blob size. Stored as SQLite BLOB.
<geometry type_name>: Geometry encoded as per clause Geometry Encoding. <geometry type_name> is one of the core geometry types listed in Geometry Types (Normative) encoded per clause 2.1.3 or a geometry type encoded per an extension such as GeoPackage Non-Linear Geometry Types. XY, XYZ, XYM, and XYZM geometries use the same data type. Stored as SQLite BLOB.
DATE: ISO-8601 date string in the form YYYY-MM-DD encoded in either UTF-8 or UTF-16. See TEXT. Stored as SQLite TEXT.
DATETIME: ISO-8601 date/time string in the form YYYY-MM-DDTHH:MM:SS.SSSZ with T separator character and Z suffix for coordinated universal time (UTC) encoded in either UTF-8 or UTF-16. See TEXT. Stored as SQLite TEXT.
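As a quick sanity check, the DATE and DATETIME string forms can be parsed and re-emitted with Python's standard library. This is a sketch; the literal values are arbitrary examples:

```python
from datetime import datetime, timezone

# DATE: YYYY-MM-DD
d = datetime.strptime("2023-07-14", "%Y-%m-%d").date()

# DATETIME: YYYY-MM-DDTHH:MM:SS.SSSZ (T separator, Z suffix for UTC)
s = "2023-07-14T08:30:00.000Z"
dt = datetime.strptime(s, "%Y-%m-%dT%H:%M:%S.%fZ").replace(tzinfo=timezone.utc)

# Round-trip back to the GeoPackage string form (millisecond precision).
out = dt.strftime("%Y-%m-%dT%H:%M:%S.") + f"{dt.microsecond // 1000:03d}Z"
assert out == s
```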
1.1.1.1.4. File Integrity
The SQLite PRAGMA integrity_check SQL command SHALL return "ok" for a GeoPackage file.
The SQLite PRAGMA foreign_key_check SQL with no parameter value SHALL return an empty result set indicating no invalid foreign key values for a GeoPackage file.
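From application code, both checks can be run directly. This is a sketch using Python's built-in sqlite3 module; an in-memory database stands in for a .gpkg file:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")

# integrity_check returns a single row containing "ok" for a healthy file.
ok, = con.execute("PRAGMA integrity_check").fetchone()
assert ok == "ok"

# foreign_key_check returns no rows when every foreign key value is valid.
violations = con.execute("PRAGMA foreign_key_check").fetchall()
assert violations == []
```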
1.1.1.2.1. Structured Query Language (SQL)
A GeoPackage SQLite Configuration SHALL provide SQL access to GeoPackage contents via SQLite version 3 software APIs.
1.1.1.2.2. Every GPKG SQLite Configuration
The SQLite library has many compile time and run time options that MAY be used to configure SQLite for different uses. Use of SQLITE_OMIT options is not recommended because certain elements of the GeoPackage standard depend on the availability of SQLite functionality at runtime.
Every GeoPackage SQLite Configuration SHALL have the SQLite library compile time options specified in clause 1.1.1.2.2 table [every_gpkg_sqlite_config_table].
1.1.2. Spatial Reference Systems
1.1.2.1. Table Definition
A GeoPackage SHALL include a gpkg_spatial_ref_sys table per clause 1.1.2.1 Table Definition, Table Spatial Ref Sys Table Definition and Table gpkg_spatial_ref_sys Table Definition SQL.
A table named gpkg_spatial_ref_sys is the first component of the standard SQL schema for simple features described in clause Simple Features SQL Introduction below. The coordinate reference system definitions it contains are referenced by the GeoPackage gpkg_contents and gpkg_geometry_columns tables to relate the vector and tile data in user tables to locations on the earth.
The gpkg_spatial_ref_sys table includes the columns specified in SQL/MM (ISO 13249-3) and shown in Spatial Ref Sys Table Definition below containing data that defines spatial reference systems. Views of this table MAY be used to provide compatibility with the SQL/MM (see SQL/MM View of gpkg_spatial_ref_sys Definition SQL (Informative)) and OGC Simple Features SQL (Table 21) standards.
srs_name: Human readable name of this SRS
srs_id: Unique identifier for each Spatial Reference System within a GeoPackage
organization: Case-insensitive name of the defining organization e.g. EPSG or epsg
organization_coordsys_id: Numeric ID of the Spatial Reference System assigned by the organization
definition: Well-known Text representation of the Spatial Reference System
description: Human readable description of this SRS
1.1.2.2. Table Data Values
Definition column WKT values in the gpkg_spatial_ref_sys table define the Spatial Reference Systems used by feature geometries and tile images, unless these SRSs are unknown and therefore undefined as specified in Requirement 11. Values are constructed per the EBNF syntax in clause 7. EBNF name and number values may be obtained from any specified authority. For example, see the return value in [spatial_ref_sys_data_values_default] Test Method step (3) used to test the definition for WGS-84 per Requirement 11:
The gpkg_spatial_ref_sys table SHALL contain at a minimum the records listed in Spatial Ref Sys Table Records. The record with an srs_id of 4326 SHALL correspond to WGS-84 as defined by EPSG [B3] in 4326. The record with an srs_id of -1 SHALL be used for undefined Cartesian coordinate reference systems. The record with an srs_id of 0 SHALL be used for undefined geographic coordinate reference systems.
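A minimal sketch of creating this table and its three required records with Python's sqlite3 module. The srs_name, organization, and definition values below are paraphrased placeholders, not the normative record values; consult the Spatial Ref Sys Table Records table for the exact contents:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Columns follow the order described above: srs_name, srs_id, organization,
# organization_coordsys_id, definition, description.
con.execute("""
    CREATE TABLE gpkg_spatial_ref_sys (
        srs_name TEXT NOT NULL,
        srs_id INTEGER PRIMARY KEY,
        organization TEXT NOT NULL,
        organization_coordsys_id INTEGER NOT NULL,
        definition TEXT NOT NULL,
        description TEXT
    )
""")
rows = [
    # srs_id 4326: WGS-84 (definition string truncated to a placeholder here)
    ("WGS 84 geodetic", 4326, "EPSG", 4326, "GEOGCS[...]", "WGS 84"),
    # srs_id -1: undefined Cartesian coordinate reference systems
    ("Undefined cartesian SRS", -1, "NONE", -1, "undefined", None),
    # srs_id 0: undefined geographic coordinate reference systems
    ("Undefined geographic SRS", 0, "NONE", 0, "undefined", None),
]
con.executemany(
    "INSERT INTO gpkg_spatial_ref_sys VALUES (?, ?, ?, ?, ?, ?)", rows
)
con.commit()
```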