Showing posts with label maps. Show all posts

Monday, March 27, 2006

Public Access to Public Data

In a development with far-reaching implications for public access to publicly funded geodata, the Guardian reported last Thursday that Tim Berners-Lee gave a speech to an Oxford University audience in which he challenged the British government to make Ordnance Survey mapping data available at no cost for Web use, and

may get his wish later this year. Sir Tim Berners-Lee told an Oxford University audience last week getting "basic, raw data from Ordnance Survey" online would help build the "semantic web", which he defines as a web of data using standard formats so that relevant data can be found and processed by computers.

Berners-Lee said it may be reasonable for OS, the premier state-owned supplier of public sector information, to continue to charge for its high-resolution mapping. But even if licences were required, he added, OS should make its data open to manipulation. "I want to do something with the data, I want to be able to join it with all my other data," he said. "I want to be able to do Google Maps things to a ridiculous extent, and not limited in the way that Google Maps is."

The guest lecturer said he had discussed this with OS. "They are certainly thinking about this and studying what they can do. OS is in favour of doing the right thing for the country, as well as maintaining its existence, so I think there's a fair chance we'll find mutual agreement."


This relates to a similarly controversial subject in my State, and anywhere else in the United States, where individual datasets for current county coverage can cost the purchaser thousands of dollars, come encumbered with copyright restrictions, and be delivered in proprietary MrSID or ESRI formats. As someone else said:
In the United States there seem to be two contradictory trends in public access to public data. On the one hand, more public data than ever before is being published on the Internet for free download. On the other hand, many public agencies ignore laws guaranteeing public access to public data, or they are providing the data in a form that renders it unusable by the public.

Roger Longhorn of Info-Dynamics Research Associates Ltd points out that
It is important to remember that, in the USA, free (no cost) access to geodata applies only to federally collected (or paid for) data. State and local government, holders of vast quantities of geodata, can (and some do) charge for access and/or exploitation of these important, typically large scale, geodata resources.

Local governments charge fees at levels that discourage innovation, throttle data dissemination, skew distribution and discourage data reuse. I don't mind paying a reasonable fee, and I truly don't mind local governments recouping reasonable costs. But hundreds of dollars, and in some cases thousands of dollars, per data set is not reasonable. Consider that these same agencies and districts would have to provide this data at the cost of copying it to CDs if requested under their freedom of information requirements. The difference in charges buys timely delivery and the substantial benefits that derive from being a team player.

Yet it is unseemly for us in the United States to complain. Our nation's history supports the basic premise that "one of the reasons to have a government is to have good map data" available to the public. Post-9/11 security concerns have clouded the issue but (as analyzed in this pdf) have not changed the fundamentals.

Rapid developments in the UK and EU will encourage those in the USA working to make publicly funded data more freely available, and less encumbered with restrictive copyrights and proprietary formats. What goes around, comes around.

Complementing open geodata efforts is the open source geospatial technologies movement. The newly formed Open Source Geospatial Foundation (discussed here, here and here) will develop the standards needed for open source to advance. I hope both movements, open source and open data, do well. On both sides of the Atlantic.



Wednesday, February 01, 2006

Precise common sense II

Elton Robinson expands nicely on the previous post by email:

The variable-rate application of inputs is actually well developed and prospering in Mid-South cotton fields. It works for two reasons. One, we have highly variable soils along the Mississippi River Delta, which in turn creates variable yields. Second, the cotton crop demands intense in-season management for plant growth, insects, weed management, disease and harvest preparation.

Infrared aerial photography and electrical conductivity mapping carts can pick up the variation in soil type when the ground is bare and pick up plant biomass when the crop is growing. Geo-referenced maps generated from the imagery allow the farmer to vary applications of plant growth regulator, defoliants and other inputs during the season based on variability in biomass. For example, the poor-yielding parts of the field will receive less plant growth regulator, to allow plants to catch up with the better-yielding parts of the field, which in turn will receive more plant growth regulator, to check vegetative growth. The result is higher yield and lower cost.
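The biomass-to-rate logic Robinson describes can be sketched as a simple prescription function. The thresholds and rates below are illustrative assumptions for the sketch, not agronomic recommendations from the post:

```python
def growth_regulator_rate(biomass_index, low=0.3, high=0.7,
                          min_rate=0.0, max_rate=16.0):
    """Map a normalized biomass index (0 to 1) to a plant growth
    regulator rate in oz/acre.

    Poor-biomass zones get little or no regulator so plants can catch
    up; vigorous zones get more to check vegetative growth. All
    threshold and rate values here are hypothetical placeholders.
    """
    if biomass_index <= low:
        return min_rate          # weak areas: let plants catch up
    if biomass_index >= high:
        return max_rate          # rank areas: curb vegetative growth
    # linear interpolation between the two thresholds
    frac = (biomass_index - low) / (high - low)
    return min_rate + frac * (max_rate - min_rate)
```

A variable-rate sprayer controller would evaluate something like this cell by cell over the geo-referenced biomass map.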

The cost to the farmer for the imagery and variable-rate prescription is $7 per acre. Sprayers can be adapted for variable-rate applications for $6,000. The cost of producing cotton is about $500 an acre. A conservative savings in input costs of 10 percent plus a 5 percent increase in yield would put $65 an acre in the farmer's pocket. If he farms 1,000 acres of cotton, that's $65,000, more than enough to pay off the cost of the technology in year one.
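Robinson's back-of-envelope economics check out; note that his stated $65/acre total implies the 5 percent yield gain is valued at about $15/acre, an inference from his figures rather than something he states directly:

```python
# Back-of-envelope check of the variable-rate cotton economics above.
ACRES = 1_000
PRODUCTION_COST = 500.0     # $/acre to produce cotton (from the post)
IMAGERY_COST = 7.0          # $/acre for imagery + variable-rate prescription
SPRAYER_RETROFIT = 6_000.0  # one-time sprayer adaptation cost

input_savings = 0.10 * PRODUCTION_COST         # 10% of inputs = $50/acre
yield_gain = 65.0 - input_savings              # implied value of the 5% yield bump
per_acre_benefit = input_savings + yield_gain  # $65/acre, as stated

farm_benefit = per_acre_benefit * ACRES                         # $65,000
first_year_tech_cost = IMAGERY_COST * ACRES + SPRAYER_RETROFIT  # $13,000

print(f"benefit ${farm_benefit:,.0f}, tech cost ${first_year_tech_cost:,.0f}, "
      f"net ${farm_benefit - first_year_tech_cost:,.0f}")
```

Even with the yield gain valued conservatively, the technology pays for itself several times over in the first season on a 1,000-acre farm.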

The technology is not affordable if there is little variability in the soil, or if a crop (corn, soybeans) does not respond as well to in-season management. I did read your previous blog on VR nitrogen, and agree that it's been very difficult for researchers to show a benefit.

Tuesday, January 31, 2006

Precise common sense

Precision ag implies computer-mapped lab data and GPS-controlled field equipment. Higher yields, less flying blind and easier farming. The reality is that the expense of data collection, analysis and interpretation can quickly wipe out any added value. Reading this article about variable-rate management of cotton, it struck me that common sense and curiosity are the missing ingredients. Elton Robinson with Delta Press reports on cotton producer Kenneth Hood of Mississippi, who attributes his success with variable-rate agriculture to, among other things, reliance on aerial photo interpretation, an approach not typical of precision agriculture. Hood says the "advantage to imagery is that very little data collection is required, which is unlike most precision agriculture practices." Put this experience together with the recent cryptic news on the lukewarm record of precision agriculture in Germany, which I touched on earlier, and what do you get? My sense is that Kenneth Hood is going to have lots of company.