Aleksander Lodwich (aleksander[/a/t/]lodwich.net)
Scientific Agenda for Great Social Innovation
The concept of a cyber-physical society is motivated similarly to the Venus Project but focuses much more strongly on the aspects of communication and of organizing human interactions with new kinds of technology. It is less concerned with technology for building cradle-to-cradle environments; by pursuing technology for self-integrating systems, I hope to be able to contribute to highly automated cities and industries.
My scientific agenda encompasses several areas of research and publication that shall contribute to the final goal of achieving a turning point in politics, society and economy which I call the Great Social Innovation (GSI). After the GSI, society has the characteristics of a decentralized, nearly anarchistic structure.
Since even high levels of automation do not prevent humans from interacting with humans, and because humans are embedded in technical systems, some rules and sanctions must exist in order to guarantee satisfying interactions. This does not, however, imply a concept of law or central enforcement authorities. Cyber-physical societies (and natural societies) rely on knowledge-based case reasoning. As is often the case, this requires a more complex understanding of the situation, the domain and the justification of motivations. In a cyber-physical society, rules are derived for the qualifications and the situation using technology. This is the closest to "law" I can imagine. Current technology is very limited in supporting this kind of practice and must be developed further.
The backbone of a cyber-physical society is systematic collection, application and verification of knowledge to technical and societal systems.
- Groups are motivated to explicitly formulate and research their social concepts in order to attain two goals: rules are effective, and individuals can formally overthrow rules of the group by improving knowledge. This shall prevent mindless majority rule. Improvements to knowledge must follow certain basic rules, such as reduction of exceptions and improvement in generality, modularity, etc. This improves the consistency and universality of rules and reduces the ability of manipulative groups to take harmful control of the group by framing the situation in their terms. Such groups would have to provide better knowledge than already exists, which, according to my hypothesis, they cannot.
- In order to do so effectively, new cultural techniques and technological artifacts are necessary to support a decentralized knowledge collection and application system that encompasses all of life. Such technologies have not been developed yet, as current technology is mainly designed to support hierarchical systems of authority - technology reflects and supports already existing schemas of power.
Even if knowledge gets superseded by newer knowledge ever sooner, improved knowledge usually relies on more data and more powerful theories to explain and predict reality. Since in a cyber-physical society rules are products which must be developed with existing knowledge as constraints and as requirements, the degrees of freedom for doing so are continuously removed as optimization goes on. Reducing the degrees of freedom in social system design continuously reduces the potential influence of archaic ideologies and antique religious concepts. This is what is meant by the statement that natural law overcomes positive law. Problems with positive law are mainly introduced by wishful thinking and by relict morals that measure only against low standards. The more degrees of freedom exist in the process of social engineering, the more effectively political activities can deter success in the design of a peaceful, harmonious society.
In a cyber-physical society there is no need for states with governments encompassing all aspects of infrastructure. Society is designed democratically by a sophisticated methodology which is applicable to social design and technology design alike (this guarantees enough common experience with the involved methodology). There are several layers of virtual rooms to manage physical resources, much like virtual operating systems. Concurrency is effectively possible. Groups can share knowledge and systems as we know it from computer software. The concept of a state citizen is gone, and states can be dissolved, as they are honeypots for the abuse of power. Areas connect and organize themselves in the organic way we know from other natural systems. Humans use one or many user accounts in the decentralized digital infrastructure I strive for in order to participate. There are no central authorities who grant identities or who have to monitor them.
In such a society there is a shift of power away from people who are good at talking and at manipulating the public, towards knowledge engineers and scientists. Occupations like politician or lawyer will slowly disappear from the scene, along with their power to influence policies and to interpret them. Since a core feature of a cyber-physical society is to eliminate capital from its operation, the risk of capital influencing societal development is greatly diminished. I write this knowing that the average person does not understand what capital is and will believe that this means the elimination of wealth. But this is far from true, as the nature of capital is something else: capital is an exclusive option which can be used to extort a disproportionate promise from others - a condition which works to concentrate all economic and political power in the hands of a few poles. In capitalistic societies all judicial concepts are designed to be portable ("tradeable") and act as means to cheaply inhibit others' ability to act. This "feature" of capitalistic societies attracts only a special kind of (e.g. emotionally defective) people into its center, who need only follow primitive strategies of power consolidation in order to deprive all other people of their rightful options to act and decide. This power consolidation via capital mechanisms is largely eliminated in a cyber-physical society, and new instruments are introduced (a real economy in the sense of the word) which truly serve the process of wealth production without pronounced concentrations of capital. Cyber-physical systems will also largely eliminate the waste of resources on weapons production and greatly reduce the production of luxury goods before more basic societal priorities have been satisfied.
The digital knowledge is designed in a democratic way by people who complete training as social engineers - a new main goal of education is to enable people to acquire the necessary competence and skills for this. Because digitized knowledge about humans, their environment and successful techniques of organization and communication is explicitly formulated and continuously converted into technology, technology and best social practice come ever closer together. If technology can support sophisticated forms of societal organization and operation, then a fractal organization of society is to be expected. This in turn allows the core features of societal function to be tested and validated by the largest number of people. This is in stark contrast to today, where state constitutions play only a marginal role in state operations (or any group operations) and hence can be easily ignored by governments (as is the case most of the time).
The cyber-physical society follows a pan-humanistic approach and can accommodate non-human entities in its concepts.
The cyber-physical society is not a utopia, but people need to be convinced on the following grounds:
- What is the necessary technology and can it be built?
- How can people be sanctioned in known, justified cases?
- Does this system really match human mental needs?
- Does it really reduce conflicts about resources?
My research aims at answering the above questions. Until then, the following statements are hypotheses under research:
- That technology can be built and applied to run distributed, self-managed groups effectively and peacefully
- That people with abnormal characteristics can be dealt with appropriately
- That human minds rely on concepts of their overall goal to attain system autonomy, and that no magical external references are needed in order to understand the needs of humans, animals and other intelligent creatures. Knowledge of mind functions is a key element in assessing the efficacy of solutions and in providing theoretical justification for them.
- That successfully applying knowledge and economic principles, and advancing technology even further, does not increase social stress but actually reduces it.
Thoughts on Autonomous Systems
Exploring high-level Perspectives on Self-Configuration Capabilities of Systems, 2016
June 29th, 2016
Optimization of product performance
repeatedly introduces the need to make products adaptive
in a more general sense. This more general idea is often
captured under the term “self-configuration”. Despite the
importance of such capability, research work on this feature
appears isolated by technical domains. It is not easy to tell
quickly whether the approaches chosen in different
technological domains introduce new ideas or whether the
differences just reflect domain idiosyncrasies. For the sake
of easy identification of key differences between systems
with self-configuring capabilities, I will explore higher level
concepts for understanding self-configuration, such as the Ω-units, in order to provide theoretical instruments for
connecting different areas of technology and research.
also available on arxiv.org
also available on researchgate.net
Differences between Industrial Models of Autonomy and Systemic Models of Autonomy, 2016
May 25th, 2016
This paper discusses the idea of levels of autonomy of systems – be they technical or organic – and
compares the insights with models employed by industry to describe the maturity and capability of its products.
also available on arxiv.org
also available on researchgate.net
Understanding Error Correction and its Role as Part of the Communication Channel in Environments composed of Self-Integrating Systems, 2016
December 20th, 2016
The rise in complexity of technical systems also raises the knowledge required to set them up and to maintain them. The cost of evolving such systems can be prohibitive. In the field of Autonomic Computing, technical systems should therefore have various self-healing capabilities allowing system owners to provide only partial, potentially inconsistent updates of the system. The self-healing or self-integrating system shall work out the remaining changes to communications and functionalities in order to accommodate change and yet still restore function. This issue becomes even more interesting in the context of the Internet of Things and the Industrial Internet, where previously unexpected device combinations can be assembled in order to provide a surprising new function. In order to pursue higher levels of self-integration capability, I propose to think of self-integration as sophisticated error-correcting communication. This paper therefore discusses an extended scope of error correction, with the purpose of emphasizing error correction's role as an integrated element of bi-directional communication channels in self-integrating, autonomic communication scenarios.
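The view of error correction as an integral part of the communication channel can be made concrete with the simplest classical scheme, the 3-bit repetition code. This is a generic textbook illustration, not material from the paper itself:

```python
# Minimal error-correcting channel: the 3-bit repetition code.
# Each data bit is transmitted three times; the receiver restores
# the intended bit by majority vote even if one copy is corrupted.

def encode(bits):
    # Repeat every bit three times.
    return [b for b in bits for _ in range(3)]

def decode(coded):
    # Majority vote over each group of three received bits.
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

data = [1, 0, 1, 1]
received = encode(data)
received[4] ^= 1                  # the channel flips one bit in transit
assert decode(received) == data   # the error is corrected
```

Self-integration, in the paper's extended sense, plays the analogous role of the decoder: it restores intended function from a partially corrupted (or partially updated) specification.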
Research on interoperability within development processes of Embedded Systems on an example, 2014
- A concept for tackling Frontloading in Model-based Engineering with AUTOSAR -
Master Thesis of Ferdinand Schäfer, University of Applied Sciences Karlsruhe,
buy and read
OFISS: An Organic End-user-programmable Data-centric User Interface Framework as Frontend to Commandline-based Applications, 2010
Bachelor Thesis of Christopher Schölzel, TU Kaiserslautern, Prof. Dr. Breuel
Many tools have made Graphical User Interface (GUI) programming easier for developers, but most of these tools do not care too
much about issues of transparency and extensibility on the toolkit-level, leaving it in the hands of developers to provide customization support such
as the possibility to create macros inside the applications themselves.
Applications written in the framework introduced in this thesis address this issue by taking the UNIX-paradigm "Everything is a file." to a new level.
Each screen object and each attribute of these objects is represented by files and folders on the filesystem and therefore accessible and
customizable for everybody, including the end-user.
Such a system might be of great use in both professional and personal settings.
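The "everything is a file" idea described above can be sketched in a few lines. This is a hypothetical illustration of the principle, not actual OFISS code; all names are invented:

```python
# Sketch: a screen object as a directory, each attribute as a file.
# Because attributes live on the filesystem, any program or end user
# can inspect and customize them with ordinary file tools.
import pathlib
import tempfile

root = pathlib.Path(tempfile.mkdtemp())

def create_widget(name, **attrs):
    # Each widget is a folder; each attribute is a file holding its value.
    w = root / name
    w.mkdir()
    for key, value in attrs.items():
        (w / key).write_text(str(value))
    return w

button = create_widget("ok_button", label="OK", x=10, y=20)
# The end user can now customize the widget by rewriting its files:
(button / "label").write_text("Cancel")
assert (button / "label").read_text() == "Cancel"
```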
Model-Based Engineering for Embedded Systems in Practice, 2014
Research Reports in Software Engineering and Management 2014:01 ISSN 1654-4870
Nadja Marko, Grischa Liebel, Daniel Sauter, Aleksander Lodwich, Matthias Tichy, Andrea Leitner, and Jörgen Hansson
Model-Based Engineering (MBE) aims at increasing the effectiveness
of engineering by using models as key artifacts in the development process.
While empirical studies on the use and the effects of MBE in industry
generally exist, there is only little work targeting the embedded systems
domain. We contribute to the body of knowledge with a study on the
use and the assessment of MBE in that particular domain. Therefore, we
collected quantitative data from 112 subjects, mostly professionals working
with MBE, with the goal to assess the current State of Practice and the
challenges the embedded systems domain is facing. Of the 112 subjects,
the majority are experienced with MBE, working at large companies in
the automotive, avionics, or healthcare domains. Additionally, mainly
OEMs and First-tier suppliers are represented in the study. Our main
findings are that MBE is used by a majority of all participants in the
embedded systems domain, mainly for simulation, code generation, and
documentation. Reported positive effects of MBE are higher quality and
improved reusability. Main shortcomings are interoperability difficulties
between MBE tools, high training effort for developers and usability
issues. The data also shows that there are no large differences between
subgroups with respect to domains, position in the value chain, company
size and product size.
Beyond Interoperability in the Systems Engineering Process for the Industry 4.0 (To be published March 2017).
Lodwich, A.; and Alvarez-Rodríguez, J. M. pages 161-182. Springer Verlag on Intelligent Systems Reference Library, 2016.
Future complex products will be different from existing ones in several relevant ways. They will be more intelligent and connected and they will have
to be greatly leaner across software and hardware in order to handle safety, security and resource demand.
Industrial Internet, Industry 4.0 and Internet of Things will greatly shift responsibility for products away from engineering
departments towards the actual environment in which the products are employed.
This situation will eventually transform the most tested companies into intelligent building
platforms where the responsibility for designing, producing and delivering
will be distributed among market parties in unrecognizable ways. The benefits
of these upcoming changes will be higher utility of products for customers and new levels of production flexibility and efficiency.
However, this new environment can only be attained if developmental and operational platforms and embedded products can rely on
reusing the explicit knowledge used in their designs. The provision of technology for this new environment
goes far beyond asking for tool interoperability. In this chapter, a conceptual layer of interoperability is outlined, describing what kind of features a powerful new interoperability technology should support in order to fuel the desired changes in engineering and production paradigms.
Note on Communication Incompatibility Types, 2016
November 18th, 2016
This note contains the description of eight basic communication incompatibility types
and basic strategies to combat them. The purpose of this note is to remind software and
system designers to design communication technologies which suffer from as few reasonably
unresolvable incompatibilities as possible.
Bubbles: a data management approach to create an advanced industrial interoperability layer for critical systems development applying reuse techniques, 2016
July 28th, 2016
The development of critical systems is becoming more and more complex, and the overall tendency is that development costs rise. In order to cut the cost of development, companies are forced to build systems from proven components and larger new systems from smaller, older ones. The respective reuse activities involve a good number of people, tools and processes along different stages of the development lifecycle. Some development is directly planned for reuse. Planned reuse implies excellent knowledge management and firm governance of reusable items. According to the current state of the art, there are still practical problems in these two fields, mainly because governance and knowledge management are fragmented over the tools of the toolchain. In our experience, the practical effect of this fragmentation is that the involved ancestor and derivation relationships are often undocumented or not exploitable. Additionally, useful reuse almost always deals with heterogeneous content which must be transferred from older to newer development environments. In this process, interoperability proves to be either the biggest obstacle or the greatest help. In this paper, the authors connect the topics of interoperability and knowledge management and propose to seek ubiquitous reuse via advanced interoperability features. A single concept from a larger Technical Interoperability Concept (TIC), named the bubble, is presented. Bubbles are expected to overcome existing barriers to cost-efficient reuse in systems and software development lifecycles. That is why the present paper introduces and defines bubbles by showing how they simplify the application of repairs and changes and hence contribute to the expansion of reuse at reduced cost.
also available on arxiv.org
also available on researchgate.net
Pattern Recognition and Data Mining
Efficient Estimation of k for the Nearest Neighbors Class of Methods, unpublished, 2011
download and read
Aleksander Lodwich, Faisal Shafait, Thomas M. Breuel
The k Nearest Neighbors (kNN) method has received much attention in the past decades, during which some theoretical bounds on its performance were identified and practical optimizations were proposed for making it work fairly well in high-dimensional spaces and on large datasets. From countless experiments of the past it became widely accepted that the value of k has a significant impact on the performance of this method. However, the efficient optimization of this parameter has not received much attention in the literature. Today, the most common approach is to cross-validate or bootstrap this value for all values in question. This approach forces distances to be recomputed many times, even if efficient methods are used. Hence, estimating the optimal k can become expensive even on modern systems. Frequently, this circumstance leads to a sparse manual search for k. In this paper we want to point out that a systematic and thorough estimation of the parameter k can be performed efficiently. The discussed approach relies on large matrices, but we want to argue that in practice a higher space complexity is often much less of a problem than repetitive distance computations.
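The core idea, computing and sorting all pairwise distances once and then scoring every candidate k against the same neighbor ranking, can be sketched as follows. This is my own minimal illustration for binary labels, not the paper's actual algorithm:

```python
# Sketch: leave-one-out accuracy for ALL candidate k values from a
# single distance computation, instead of re-running kNN per k.
# Assumes binary labels 0/1 for the simple majority vote below.
import numpy as np

def loo_accuracy_all_k(X, y, k_max):
    n = len(X)
    # Full pairwise distance matrix, computed exactly once.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                # leave-one-out: exclude self
    order = np.argsort(d, axis=1)[:, :k_max]   # neighbor ranks, sorted once
    neigh_labels = y[order]                    # shape (n, k_max)
    accs = []
    for k in range(1, k_max + 1):
        votes = neigh_labels[:, :k]
        # Majority vote among the first k neighbors (ties go to class 0).
        pred = (votes.sum(axis=1) * 2 > k).astype(int)
        accs.append(float((pred == y).mean()))
    return accs
```

The distance matrix costs O(n^2) space, which is the trade-off the abstract argues is usually cheaper in practice than repeating the distance computations for every value of k.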
Report on Practical Bayes-True Data Generators For Evaluation of Machine Learning, Pattern Recognition and Data Mining Methods, technical report, 2009
download and read
Janick V. Frasch, Aleksander Lodwich, Thomas M. Breuel
Benchmarking pattern recognition, machine learning and data mining methods commonly relies on real-world data sets. However, there exist several reasons for not using real-world data. On the one hand, collecting real-world data can become difficult or impossible for many reasons; on the other hand, real-world variables are difficult to control, even in the problem domain. In the feature domain, where most statistical learning methods operate, control is even more difficult to achieve and hence rarely attempted. This is at odds with scientific experimentation guidelines mandating the use of variables that are as directly controllable and as directly observable as possible. Because of this, synthetic data is a necessity for experiments with algorithms. In this report we present four algorithms that produce data with guaranteed global and intra-data statistical or geometrical properties. The data generators can be used for algorithm testing and fair performance evaluation of statistical learning methods.
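The notion of a Bayes-true generator, i.e. synthetic data whose optimal error rate is known by construction, can be illustrated with a minimal example. This is my own sketch, not one of the report's four generators:

```python
# Sketch: two-class 1D data from known Gaussian class densities.
# Because both densities are chosen by us, the Bayes-optimal error
# rate is available in closed form, so any learner's measured error
# can be compared against the true optimum.
import math
import random

def make_data(n, mu0=0.0, mu1=2.0, sigma=1.0, seed=0):
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        label = rng.random() < 0.5       # equal class priors
        mu = mu1 if label else mu0
        data.append((rng.gauss(mu, sigma), int(label)))
    return data

def bayes_error(mu0=0.0, mu1=2.0, sigma=1.0):
    # Equal priors, equal variances: the optimal rule thresholds at the
    # midpoint, and its error rate is Phi(-|mu1 - mu0| / (2 * sigma)).
    z = abs(mu1 - mu0) / (2 * sigma)
    return 0.5 * math.erfc(z / math.sqrt(2))
```

With the defaults above, `bayes_error()` is about 0.159, so a classifier measured on this data can never legitimately do better than roughly 84% accuracy in the long run.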
Optimization of the Vagus Nerve Stimulation Parameters by the Means of Computational Intelligence, 2007
(original data can be requested from me)
related: Animal model of the short-term cardiorespiratory effects of intermittent vagus nerve stimulation, 2008
Optimization of the Vagus Nerve Stimulation Parameters with CI-Methods, unpublished, 2006
download and read
In order to improve Vagus Nerve Stimulation (VNS), the statistical increase in efficacy for certain parameter vectors was investigated. Such vectors are used in real cases. Although immediate tests on humans are not possible, rats were found to be an adequate substitute model. The rats were stimulated with a protocol of 81 parameter vectors and their physical responses were recorded. Each rat is different, but it is assumed that for some vectors similar responses will occur. If enough rats were recorded, a statistically safe prediction could be made about which parameter vectors will result in the effects most desired by the therapist. The primary goal of the project is to find similarities between different response cases for some of the typical vectors and eventually provide deeper insights into the treatment. The project faces extremely large amounts of data from data acquisition. Methods of Computational Intelligence (CI) are used for the conversion of data into knowledge. After conventional data pre-processing, evolutionarily designed artificial neural networks are used to evaluate the available data and to help medical staff verify their medical theses.