
Newly inserted data in PostGIS fails to display correctly in GeoServer


I have created a PostGIS store in GeoServer. I created layers with points and lines and display them in an OpenLayers-based application. I also use an FME workflow to insert new data into my PostGIS table. I expected my GeoServer layers to update automatically, but that didn't happen.

The problem was that I had to set new boundaries in the layer's properties. After this I could see some (but not all) of my newly inserted data on the map. Different information was displayed at different zoom levels; for example, at the outermost zoom level I could see all the data, but zooming in I saw less (see figures).

I thought it might be related to caching. I used the GeoServer interface to seed and generate missing tiles, and according to GeoServer the process completed successfully.

Lastly, I restarted the GeoServer and Tomcat 7 services, without any success either.

When I select layer preview in


I've seen this behavior before with point data and have not figured out a fix for the existing layer. The workaround I've used in the past is to make a copy of the table in the database under a different name and re-create the layer. When you reach the Bounding Box section, make sure to slightly over-estimate the bounding box values (or use the maximum extent the data could grow to). This ensures that edge features do not get trimmed off as you zoom in.
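The over-estimation step can be sketched numerically. The helper below is a hypothetical function (not part of GeoServer); it pads each side of a bounding box by a fraction of its width and height before you type the values into the Bounding Box section:

```python
def expand_bbox(minx, miny, maxx, maxy, margin=0.05):
    """Grow a bounding box by `margin` (a fraction of each dimension) on
    every side, so edge features are not clipped as new data arrives."""
    dx = (maxx - minx) * margin
    dy = (maxy - miny) * margin
    return (minx - dx, miny - dy, maxx + dx, maxy + dy)

# Example: a layer extent of (0, 0, 10, 10) padded by 10% per side
print(expand_bbox(0, 0, 10, 10, margin=0.10))  # (-1.0, -1.0, 11.0, 11.0)
```

Picking the margin is a judgment call: too small and freshly inserted features near the edge are still clipped, too large and you waste tiles on empty space.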


These instructions provide a virtual machine that has already been configured with Boundless Suite, eliminating the need to install Boundless Suite directly on your machine.

All interaction with Boundless Suite will still be done through your host’s standard browser or a terminal connection. There is no need to interact directly with the virtual machine console once it is running.

These instructions should only be used with VirtualBox. This machine has VirtualBox Guest Additions installed, which are required for functionality such as shared folders.

  • Please disable any programs on your system that use ports 2020, 5432, 8080, or 8433.
  • Make sure you have administrative / super-user privileges on your system.
  • You must be able to run a 64-bit virtual machine. 32-bit machines are not supported.
  • You must have at least 2GB of memory and 4GB of free disk space (plus extra space for any loaded data).
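Before importing the appliance, it can be worth confirming that none of the required ports are already taken. The following sketch checks each port on the local machine (the port list is taken from the requirements above; the host and timeout values are arbitrary choices):

```python
import socket

PORTS = (2020, 5432, 8080, 8433)

def port_in_use(port, host="127.0.0.1"):
    """Return True if something is already listening on the given TCP port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) == 0

for port in PORTS:
    print(f"port {port}: {'IN USE - free it first' if port_in_use(port) else 'free'}")
```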

To install VirtualBox, open a terminal: navigate to Applications ‣ System Tools ‣ Terminal.

Opening Terminal

Execute the following sequence of commands in the console:

After installation, run VirtualBox. Navigate to File ‣ Import Appliance.

Select the Boundless Suite virtual machine file.

Details about the virtual machine will be displayed. Click Import.

The Boundless Suite license agreement will display. Click Agree to accept.

You will now see the Boundless Suite entry in the list of virtual machines in VirtualBox.

VirtualBox Manager showing Boundless Suite virtual machine

Click to select the virtual machine, then click Shared Folders.

Accessing the shared folder menu

To make it easy to copy files from your host system to the virtual machine, we recommend creating a shared folder: any files copied to that folder will be accessible inside the virtual machine. Right-click the blank area of the dialog and select Add shared folder (or press Insert).

Link to add a new shared folder

  • For Folder Path, select a directory on the host machine that will serve as the shared folder. One good option for this directory would be the Desktop.
  • For Folder Name, enter share.
  • Check Auto-mount.

When finished, click OK, then click OK again to close the Settings page.


Abstract

In the spring of 2013, NASA conducted a field campaign known as Iowa Flood Studies (IFloodS) as part of the Ground Validation (GV) program for the Global Precipitation Measurement (GPM) mission. The purpose of IFloodS was to enhance the understanding of flood-related, space-based observations of precipitation processes in events that transpire worldwide. NASA used a number of scientific instruments such as ground-based weather radars, rain and soil moisture gauges, stream gauges, and disdrometers to monitor rainfall events in Iowa. This article presents the cyberinfrastructure tools and systems that supported the planning, reporting, and management of the field campaign and that allow these data and models to be accessed, evaluated, and shared for research. The authors describe the collaborative informatics tools, which are suitable for the network design, that were used to select the locations in which to place the instruments. How the authors used information technology tools for instrument monitoring, data acquisition, and visualizations after deploying the instruments and how they used a different set of tools to support data analysis and modeling after the campaign are also explained. All data collected during the campaign are available through the Global Hydrology Resource Center (GHRC), a NASA Distributed Active Archive Center (DAAC).


Polk Township, Huntington County, Indiana

According to the 2010 census, the township has a total area of 24.11 square miles (62.4 km²), of which 22.76 square miles (58.9 km²) (or 94.40%) is land and 1.35 square miles (3.5 km²) (or 5.60%) is water. [1]

Cities and towns

Unincorporated communities

Adjacent townships



  • Dallas Township (north)

  • Huntington Township (northeast)

  • Lancaster Township (east)

  • Jefferson Township (southeast)

  • Wayne Township (south)

  • Liberty Township, Wabash County (southwest)

  • Lagro Township, Wabash County (northwest)

Cemeteries

The township contains one cemetery, Monument City Memorial.

Major highways



  • Indiana State Road 105


  • Indiana State Road 124



Login as another user in PostGIS using QGIS 2.18 without restart


I have a PostGIS DB with three roles: admin, user and readonly.

When I set up a connection to this database via "Add PostGIS layer", the connection is saved as it should be. I then log in as admin to create some tables. I don't save any login information.

But now I have no way to log out and log back in as another user, e.g. to check whether I set up the roles correctly.

I tried to reconnect via "Add PostGIS layer", but this has no effect.
The only way to do this seems to be to close QGIS and restart it.

Is there another way to log out and log back in as another user? For now this has to be done using QGIS 2.18.

I figured out that I can run three instances of QGIS with three different sets of login information at the same time. Better than restarting, but not the "solution" I was hoping for.

Maybe use the "changeDataSource" plugin; look at gis.stackexchange.com/questions/62610/…

You can add another connection as the readonly user and add the layer from that account. The behaviour (not editable, for example) is then observable.

Seems logical, but unfortunately this does not work. QGIS somehow recognizes the two connections to the same DB as one, and again only the login information I entered first is used. Therefore the changeDataSource plugin doesn't help either, as it uses the connections provided by QGIS.
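If you go the multiple-instances route, the launches can be scripted. The sketch below assumes that QGIS falls back to libpq's standard PGUSER/PGPASSWORD environment variables when no username is saved with the connection (as in the setup described above, where login information is not stored); the `qgis` binary name is also an assumption and may differ on your system:

```python
import os
import subprocess

def qgis_env_for_role(role, password):
    """Build an environment for a QGIS instance that should authenticate
    to PostgreSQL as `role`, via libpq's standard environment variables."""
    env = dict(os.environ)
    env["PGUSER"] = role          # libpq: user name to connect as
    env["PGPASSWORD"] = password  # libpq: password (convenient, not secure)
    return env

def launch_qgis_as(role, password, qgis_binary="qgis"):
    """Start a separate QGIS process that picks up the role's credentials.
    `qgis_binary` is an assumption; adjust to your installation."""
    return subprocess.Popen([qgis_binary], env=qgis_env_for_role(role, password))

# e.g. launch_qgis_as("readonly", "secret")
```

Putting the password in the environment is only acceptable for local testing; a `.pgpass` file is the usual safer alternative.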



Field measurements in river embankments: validation and management with spatial database and webGIS

This study focuses on the development of a system combining a spatial database and a webGIS to store, validate and display data, in order to assist decision makers in managing early warning systems for river embankment failure. To obtain precise results, it is essential to have a tool capable of managing a large amount of data, checking its reliability, and locating it in space. In this paper, special emphasis is given to the development of procedures for assessing the reliability of the measurements. For this purpose, the database includes all the information needed to describe instrument performance, such as the sand pack size and casing diameter of open-standpipe piezometers (for evaluating their time lag) and the calibration curves of transducers, with the possibility of updating them. The positions of non-functioning instruments are identified through analysis of the electrical signal and spatial displays, while analyses of the redundancy and coherence of measurements are used to detect doubtful data. The database and webGIS were applied to the monitoring data of an embankment of the Adige River in Northern Italy. The database and webGIS system has proved to be a suitable and effective tool for the management and validation of real-time data and periodic field measurements.



2 Answers

You have a lot of issues here:

pre_get_posts is not the correct hook for setting templates. pre_get_posts is used to alter the main query's query vars just before the SQL for the main query is built.

A filter should always return something. Not doing so will cause unexpected behavior, and forgetting about it can leave you on a wild goose chase for hours debugging the issue.

Using globals to control theme features or to store any kind of data is bad practice and unsafe coding. WordPress has already made a huge mess of globals, particularly around naming conventions. Just look at how newcomers (who don't know WordPress) unknowingly use variables like $post and $posts as local variables. These are native globals used by WordPress, and using them as local variables breaks their values.

Because of this, something on the page goes wrong, there are no errors, and you are stuck on a wild goose chase trying to debug something you unknowingly broke. Globals are pure evil and you should avoid using them. Just think: if you use the variable $wpgo_global_column_layout for the query arguments of a custom query, you will break the value of the template that needs to be set. Your template does not load because the value of $wpgo_global_column_layout is not recognized as a valid template name, and you are stuck, not knowing why, even though your code looks as if it should definitely load a custom template.

is_tax() is the wrong check for whether a post has a certain term; is_tax() simply checks whether you are on a taxonomy archive. You should be using has_term(), which does just that: it checks whether a given post has a certain term.

If you need to set a template for a taxonomy page, single_template is the wrong hook; you should be using the taxonomy_template hook or the more generic template_include filter.

In the line $post->post_type == 'advert' || is_tax( 'advert_category' ), I suspect you are using the wrong operator: you should be using the AND operator. I'm not going to explain this here, as I have already done something similar elsewhere. Note that, with the current setup, whenever you are viewing a post from the post type advert, your condition will return true and fire whether or not the second condition ( is_tax( 'advert_category' ) ) fails.

If you need to target a term according to parent/child order, you simply need to check the term object's $parent property. A value of 0 means the term is a parent; any other value means the term is a child/grandchild/great-grandchild/etc.

Let's drop the crappy globals and set the templates properly. I do not know exactly how your theme sets templates through $wpgo_global_column_layout, but the following should work, with priority. I have commented the code to make it easy to follow:


5 Answers

I haven't yet found any third-party batteries that have a chip in them.

As you say, using a battery without the chip doesn't give the camera power-level metering. Not knowing exactly how much power is left is of course a bit inconvenient, but it can also cause other problems. The camera uses the power-level information to shut down safely when the level gets critically low, and without that information the camera might run out of power in the middle of an I/O operation. If you are shooting stills the risk is not very high, but if you are shooting video when the power runs out, that is quite likely to corrupt the video file. It may also corrupt the filesystem data, which will make all the files on the card unreadable.

The third-party batteries I have seen come with a special charger, as they can't be charged with the original charger. That means you have to bring two chargers if you have both types of batteries.

Difference as percentage of camera price: 1%

I've seen several different investigations of generic batteries, and while many of them are quite good, the actual capacity varies wildly. In some cases, once you factor that in, you actually get more for your money with the brand-name batteries. It's an unfortunate situation, and it's too bad there isn't better standardization (and honest labeling), but I think it's best to just treat a few extra batteries as an extra couple of percent on top of the purchase price and, basically, suck it up.

The Maxtek batteries, when purchased directly from them on Amazon, are guaranteed to work. I bought two on 12 Dec 2013 and they work just fine and charge fine in the Canon charger. They suggest buying directly from them right now to ensure you get the latest version with the chip Canon will recognize, as they can't be sure Amazon or other retailers have gotten rid of older stock that is not compatible.

Prior to the more recent firmware updates I didn't have any problem with another generic brand working, but now it can't communicate with the body.

A recent major photography blog had an article about this issue; they had contacted Maxtek, and the advice was as written above (buy from them to ensure you get the latest version with the chip).

In my experience, the better third-party batteries work just fine, and the cheaper ones don't.

I've had good luck with Watson (which I'm pretty sure is a house brand of B&H, or at least related in some way). My 6D came with one about four-ish years ago, and it still works (even with my 5D Mark IV). I ended up buying a second one when I got my 5D Mark IV to make it a matched pair. :-)

By contrast, the cheap third-party batteries that came with my 6D's third-party battery grip (and the replacements for those after the first two failed) all had chips too, but the chips stopped working after random periods of time ranging from days to months. I went through, I think, six of them before concluding that it wasn't worth continuing to ask for replacements.

If you buy quality third-party batteries, such as SterlingTek or MaximalPower, from reputable sources, you should get just as good performance as the OEM Canon batteries at significant savings.

Another thing to consider is that genuine OEM batteries are more likely to be counterfeited and passed off as genuine by shady sellers. Fake third-party batteries aren't nearly as common. After all, if you're going to make a cheap fake, why not mimic the version that sells for $60 instead of the version that sells for $20 or $10 or $5? If you buy a "genuine" battery from an unauthorized seller, it is highly likely you have bought a fake. If you buy "genuine" or third-party batteries from authorized, reputable sources, you are much more likely to get what you think you are paying for.

The MaximalPower versions of the Canon LP-E6 I bought from amazon.com function just like the OEM batteries supplied with my cameras. So do the SterlingTek LP-E6 batteries I've bought via Amazon and the Watson batteries I've bought from B&H. They charge on the same charger; the camera reads the serial # in the battery, displays the charge level, number of shots, and recharge performance, and remembers the date and charge state from the last time each battery was used in the camera.¹

These reputable brands are still about 1/4 to 1/3 the cost of Canon OEM.

I also used SterlingTek batteries with my Rebel XTi and 50D. The SterlingTek NB-2LH and BP-511A were every bit as good as the Canon batteries for those cameras. The 2200 mAh SterlingTek BP-511As lasted much longer per charge than the Canon BP-511A 1390 mAh originals. I also tried some of the really cheap generic versions for the XTi and had less than stellar results: they didn't last as long per charge, nor as many charge/discharge cycles before they would no longer take a full charge.

¹Not specific to the 5D, 5D Mark II, or 7D but applicable to the 5D Mark III or 7D Mark II and later:

Older LP-E6 third party batteries made prior to around 2012-13 don't fully communicate with Canon bodies released since about 2013 (including the 2012 5D Mark III if the camera was shipped with or updated to firmware version 1.2.3 released in August 2013 or later). The newer chargers supplied since 2013 will also balk at charging the older third party batteries, but do just fine with the newer third party versions that have the newer firmware introduced around 2013 embedded in them.

The older third party batteries, when charged with an older Canon charger or a third party charger, will still power the newer cameras perfectly fine, they just don't give detailed information regarding shutter count since last charge, recharge performance, etc. With some cameras you will be asked to confirm what type of battery you are using.

Canon periodically updates the battery protocol, apparently just to discourage the use of third-party batteries. Canon's older batteries are not (supposed to be²) affected, because the firmware in the older batteries already contains some "secret" lines of code that are only needed with the updated protocols. When a newer camera detects a battery without the hidden code, it will give you the message to try to scare you into buying only Canon batteries.

² When Canon updated the LP-E6 battery to the LP-E6N and revised the LC-E6E charger they had an issue with many older OEM Canon LP-E6 batteries not charging properly in the new charger.

Since the third-party battery manufacturers reverse engineer their batteries, older copies of their LP-E6 replacements, which were reverse engineered from the older Canon batteries, don't include the "hidden code": the older cameras those copies were based on do not interact with the "hidden" lines of code.

It's all a cat and mouse game. It usually only takes a few weeks for the top third party battery makers to crack the new protocol and include it in their copies. I use MaximalPower (Amazon is the only authorized seller) and Sterling Tek third party batteries. My older ones function fully in the 5DII and 7D, but have the limited functionality in the 5DIII and 7DII. My newer third party batteries from MaximalPower and Sterling Tek also fully function in the 5DIII and 7DII. The third party batteries seem to also handle more charge/discharge cycles before their performance noticeably degrades. That may be one reason why Canon plays such games: their own batteries aren't as good as the best third party batteries. There are a lot of crappy third party batteries too, though.


3 Answers

I do need to add both post_meta:

And the rest of the answer "should" be covered by wp_set_object_terms:

However, this function is not fully available at the point I need it - so the answer was to create a simple replacement for this function:

Which I can call using a static class instance (as the method is part of a separate class in my case):

I've found that the update_field() function actually does the job for me; I don't know if it's been updated since to make that happen. All you have to watch out for is whether you have a single-choice (select) field or one that allows multiple selections: basically, you need to pass through an array of taxonomy IDs if it allows multiples.

The update_field() function will handle the serialization for you automatically based on the field settings, so you don't have to worry about anything else.

I've used this method to update both single and multiple taxonomy fields on the same post record, and I've tested that I can then query those results using the ACF-recommended ways of doing so, so it's definitely putting the data in correctly.

EDIT: one thing to bear in mind: you'll notice I've turned the term_id into a string by wrapping it in quotes. ACF stores the IDs as strings in serialized arrays, so this is essential, or their suggested ways of querying this data won't work (and when you go to edit the field through wp-admin in the future, it will change it back to a string anyway).
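To make the stored shape concrete, here is a small illustrative re-implementation (not ACF's or PHP's actual code) of how a list of term IDs looks once serialized as an array of strings in post meta:

```python
def php_serialized_term_ids(term_ids):
    """Mimic how a list of term IDs ends up in post meta when stored as a
    PHP-serialized array of *strings* (s: entries, not i: entries), which
    is the shape described above. Illustrative only, not ACF's own code."""
    parts = []
    for index, term_id in enumerate(term_ids):
        value = str(term_id)  # the crucial cast: IDs are stored as strings
        parts.append(f'i:{index};s:{len(value)}:"{value}";')
    return "a:%d:{%s}" % (len(term_ids), "".join(parts))

print(php_serialized_term_ids([12, 7]))  # a:2:{i:0;s:2:"12";i:1;s:1:"7";}
```

An integer ID would serialize as `i:12;` instead of `s:2:"12";`, which is exactly why a meta query matching on the string form would miss it.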


4 Answers

You're probably cutting and pasting the command (or part of it) from a document instead of typing it in manually. Usually this doesn't make any difference, but in this case the second quote character was inserted as a "right single quotation mark" (’) instead of an "apostrophe" ('). The difference is subtle; see this page for more details:

The reason this probably happened is that when you first typed the command into the document to save it for future reference, your word processor automatically converted the second apostrophe into a right single quotation mark. It does this to make the character look nicer on screen, but bash doesn't recognize this character as a valid closing quote, so you run into the problem. It prints "> " to prompt for further input because it still thinks the original quote has not been closed.

The fix is to change that character back to an apostrophe: just retype it manually into bash from the keyboard. You can also correct it in your document so that future cut-and-pasting works fine.
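If you paste commands often, a quick scan for typographic quotes can save a debugging session. This sketch simply lists the offending characters and their positions in a pasted command string:

```python
# Typographic quote characters that word processors commonly substitute
SMART_QUOTES = {
    "\u2018": "left single quotation mark",
    "\u2019": "right single quotation mark",
    "\u201c": "left double quotation mark",
    "\u201d": "right double quotation mark",
}

def find_smart_quotes(command):
    """Return (index, description) pairs for every typographic quote in a
    pasted command, so you know which characters to retype as ' or "."""
    return [(i, SMART_QUOTES[ch])
            for i, ch in enumerate(command) if ch in SMART_QUOTES]

print(find_smart_quotes("echo 'hello\u2019"))
# [(11, 'right single quotation mark')]
```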

