Generative machine learning models for creative purposes play an increasingly prominent role in the field of dance and technology. A particularly popular approach is the use of such models for generating synthetic motions. These motions can either serve as a source of ideation for choreographers or control an artificial dancer that acts as an improvisation partner for human dancers. Several examples employ autoencoder-based deep-learning architectures trained on motion-capture recordings of human dancers. Synthetic motions are then generated by navigating the autoencoder's latent space. This paper proposes an alternative approach to using an autoencoder for creating synthetic motions, one that controls the generation of synthetic motions on the level of the motion itself rather than its encoding. Two different methods are presented that follow this principle. Both are based on the interactive control of a single joint of an artificial dancer while the other joints remain under the control of the autoencoder. The first method combines the control of the orientation of a joint with iterative autoencoding. The second method combines the control of the target position of a joint with forward kinematics and the application of latent difference vectors. As an illustrative example of an artistic application, the latter method is used for an artificial dancer that plays a digital instrument. The paper presents the implementation of these two methods and provides preliminary results.
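The latent-difference-vector idea behind the second method can be sketched in a few lines. The autoencoder below is a hypothetical linear stand-in (random projection plus pseudo-inverse), not the paper's trained model; it only illustrates how a change applied to one controlled joint in pose space can be transferred to other poses through the latent space.

```python
import numpy as np

# Toy stand-in for a trained motion autoencoder (hypothetical):
# encode() projects a pose vector into a 2-D latent space,
# decode() reconstructs the pose from the latent code.
rng = np.random.default_rng(0)
W = rng.standard_normal((2, 6))          # 6-D pose -> 2-D latent

def encode(pose):
    return W @ pose

def decode(z):
    return np.linalg.pinv(W) @ z

# Latent difference vector: the shift in latent space caused by
# changing a single controlled joint in pose space.
pose = rng.standard_normal(6)
controlled = pose.copy()
controlled[0] += 0.5                      # move one joint interactively

dz = encode(controlled) - encode(pose)    # latent difference vector

# Applying dz to the latent code of any other pose transfers
# the controlled joint's change to that pose.
other = rng.standard_normal(6)
steered = decode(encode(other) + dz)
print(steered.shape)                      # (6,)
```

In the paper's setting the encoder and decoder are deep networks and the controlled joint's target position is resolved via forward kinematics; the difference-vector arithmetic itself is the same.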
Privacy is the capacity to keep some things private despite their social repercussions. It relates to a person's ability to control the amount, timing, and circumstances under which they disclose sensitive personal information, such as details of their physiology, psychology, or intelligence. In the age of data exploitation, privacy has become even more crucial. Our privacy is more threatened now than it was 20 years ago because of the way data and technology are used. Both the kinds and amounts of information collected about us and the methods for tracking and identifying us have grown substantially in recent years. It is a known security concern that both human and machine systems face privacy threats. There are various disagreements over privacy and security; every person and group has a unique perspective on how the two are related. Even though 79% of the study's results showed that legal or compliance issues were more important, 53% of the survey participants thought that privacy and security were two separate things. Despite their distinctions, data security and data privacy are interconnected; each is necessary for the other to exist. Data may be physically kept anywhere, on our computers or in the cloud, but only humans have authority over it, and we are linked to our data. Protecting data against attackers also protects privacy, and machine learning can support this protection. Attackers commonly utilize both mechanical systems and social engineering techniques to enter a target network. The vulnerability of this form of attack rests not only in the technology but also in the human users, making it extremely difficult to defend against. The best option to secure privacy is to combine humans and machines in the form of a Human Firewall and a Machine Firewall.
A cryptographic route like Tor is a superior choice for discouraging attackers from trying to access our system and for protecting the privacy of our data. This thesis contains a case study of privacy and security issues. The problems and the different kinds of attacks on people and machines are then briefly discussed. We explain how Human Firewalls and machine learning on the Tor network protect our privacy from attacks such as social engineering and attacks on mechanical systems. As a real-world test, we use genomic data to carry out a privacy attack called the Membership Inference Attack (MIA). We present the Machine Firewall as a way to protect ourselves and then apply Differential Privacy (DP), following prior work. We applied Lasso and convolutional neural networks (CNN), both popular machine learning models, as the target models. Our findings demonstrate a logarithmic link between the desired model accuracy and the privacy budget.
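The accuracy/privacy-budget trade-off mentioned above can be illustrated with the standard Laplace mechanism, the simplest building block of differential privacy. This is a generic sketch, not the thesis's exact DP setup: a query answer is released with noise scaled to sensitivity/epsilon, so a smaller privacy budget epsilon means more noise and lower accuracy.

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng):
    """Release `value` with epsilon-differential privacy by adding
    Laplace noise with scale = sensitivity / epsilon."""
    scale = sensitivity / epsilon
    return value + rng.laplace(0.0, scale)

rng = np.random.default_rng(42)

# A counting query (sensitivity 1): how many records match a predicate.
true_count = 128

# Smaller epsilon -> larger noise -> lower accuracy, mirroring the
# logarithmic accuracy/privacy-budget link reported in the thesis.
for eps in (0.1, 1.0, 10.0):
    noisy = laplace_mechanism(true_count, sensitivity=1, epsilon=eps, rng=rng)
    print(f"epsilon={eps:>4}: noisy count = {noisy:.1f}")
```

For model training, as with the Lasso and CNN target models, noise is instead injected into the optimization (e.g. DP-SGD), but the epsilon-versus-accuracy behaviour follows the same principle.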
We aim to debate, and eventually be able to carefully judge, how realistic the following statement of a young computer scientist is: “I would like to become an ethically correctly acting offensive cybersecurity expert.” The objective of this article is neither to judge what is good and what is wrong behavior nor to present an overall solution to ethical dilemmas. Instead, the goal is to become aware of the various personal moral dilemmas a security expert may face during their work life. For this, a total of 14 cybersecurity students from HS Offenburg were asked to evaluate several case studies according to different ethical frameworks. The results and particularities are discussed in light of these frameworks. We emphasize that different ethical frameworks can lead to different preferred actions and that the moral understanding of the frameworks may differ even from student to student.
The isolation measures adopted during the COVID-19 pandemic brought to light discussions about the importance of meaningful social relationships as a basic need for human well-being. But even before the pandemic outbreak in 2020 and 2021, organizations and scholars were already drawing attention to the growing number of lonely people in the world (World Economic Forum, 2019). Loneliness is an emotional distress caused by the lack of meaningful social connections; it affects people worldwide across all age groups, especially young adults (Rook, 1984). The use of digital technologies has gained prominence as a means of alleviating this distress. For example, studies have shown the benefits of using digital games both to stimulate social interactions (Steinfield, Ellison & Lampe, 2008) and to enhance, through gamification, the effects of digital interventions for mental-health treatments (Fleming et al., 2017). With these aspects in mind, the gamified app Noneliness was designed with the intention of reducing loneliness rates among young students at a German university. In addition to sharing the related works that supported the application's development, this chapter presents the aspects considered in the resource's design, its main functionalities, and preliminary results on the reduction of loneliness in the target audience.
Organizations striving for long-term success must maintain a positive brand image, which has direct implications for the business. In the face of rising cyber threats and intense competition, maintaining a threat-free domain is an important part of preserving that image on today's internet. For numerous companies, domain names are near-synonyms for brand names. There are likely thousands of domains that impersonate big companies in a bid to trap unsuspecting users, who then fall prey to attacks such as phishing or watering-hole attacks. Because domain names are important for organizations running their business online, they are also particularly vulnerable to misuse by malicious actors. So how can you ensure that your domain name is protected while also protecting your brand identity? Brand Monitoring, for example, may assist. The term "Brand Monitoring" refers to keeping tabs on an organization's brand performance, reception, and overall online presence across various online channels and platforms [1]. As the threat environment has expanded, so has the need to keep one's domain clear of any links to malicious activities. Since attackers target organizations' domain names and lure unsuspecting users to malicious websites, domain monitoring becomes an important aspect. Another important aspect of brand abuse is how attackers leverage brand logos to create fake and phishing web pages. In this Master's Thesis, we address the classification of impersonated domains using rule-based and machine learning algorithms, as well as the automation of domain monitoring. We first use a rule-based classifier and machine learning algorithms to classify the gathered domains into two buckets – "Parked" and "Non-Parked".
In the project's second phase, we deploy object-detection models (Scale-Invariant Feature Transform, SIFT, and Multi-Template Matching, MTM) to detect brand logos on the domains of interest.
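The first-phase rule-based classification can be sketched as a small keyword matcher. The patterns below are hypothetical examples; the thesis combines such rules with machine-learning features rather than relying on keywords alone.

```python
import re

# Hypothetical keyword rules typical of parked landing pages.
PARKED_PATTERNS = [
    r"this domain is for sale",
    r"buy this domain",
    r"domain parking",
    r"related searches",
]

def classify_page(html: str) -> str:
    """Minimal rule-based classifier: 'Parked' vs 'Non-Parked'."""
    text = html.lower()
    hits = sum(bool(re.search(p, text)) for p in PARKED_PATTERNS)
    return "Parked" if hits >= 1 else "Non-Parked"

print(classify_page("<h1>This domain is for sale!</h1>"))   # Parked
print(classify_page("<h1>Welcome to our web shop</h1>"))    # Non-Parked
```

In practice such rules serve as a cheap first filter, with the machine-learning classifier handling the pages the rules cannot decide.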
Even though the internet has existed for only a short time, it has grown tremendously. Today, a significant portion of commerce is conducted entirely online because of the increase in internet users and technological advancements in web construction. Cyberattacks and threats have also expanded significantly, leading to financial losses, privacy breaches, identity theft, a decrease in customers' confidence in online banking and e-commerce, and damage to brand reputation and trust. An attacker pretending to be a genuine and trustworthy institution can steal private and confidential information from a victim. Phishing, in particular, has been an ongoing issue for a long time and has cost the global economy billions of dollars. In recent years, there has been significant progress in the development of phishing detection and identification systems to protect against phishing attacks. Phishing detection technologies frequently produce binary results, i.e., whether a phishing attempt was made or not, with no explanation. Phishing identification methodologies, on the other hand, identify phishing webpages by visually comparing them with predetermined authentic references and reporting phishing together with its target brand, resulting in findings that are understandable. However, technical difficulties in the field of visual analysis limit the applicability of currently available solutions, preventing them from being both effective (high accuracy) and efficient (little runtime overhead). Here, we evaluate an existing framework called Phishpedia. This hybrid deep-learning system can recognize identity logos in webpage screenshots and match logo variants of the same brand with high precision. Phishpedia provides high accuracy with low runtime. Lastly, unlike other methods, Phishpedia does not require training on any phishing samples whatsoever.
Phishpedia exceeds baseline identification techniques (EMD, PhishZoo, and LogoSENSE), accurately detecting phishing pages in lengthy testing on real phishing data. Its effectiveness was tested and compared against other standard machine learning algorithms and some state-of-the-art algorithms. The evaluated solution performed better than the other algorithms on the given dataset, which is impressive.
Technological advancement has played a vital role in business development; however, it has also opened a broad attack surface. Passwords are one of the essential concepts used in applications for authentication. Companies manage many corporate applications, so employees must meet numerous password criteria, which leads to password fatigue. This thesis addresses this issue and how it can be overcome by theoretically implementing an Identity and Access Management (IAM) solution. We discuss MFA, SSO, biometrics, strong password policies, and access control, and introduce an IAM framework that should be considered when implementing an IAM solution. Implementing an IAM solution adds an extra layer of security.
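A strong password policy of the kind discussed above amounts to a set of checkable rules. The thresholds below are hypothetical illustrations, not the thesis's exact criteria; the sketch only shows how such a policy can be enforced programmatically.

```python
import re

# Illustrative password policy (hypothetical thresholds):
# each rule is a regex that a compliant password must match.
RULES = [
    (r".{12,}",        "at least 12 characters"),
    (r"[a-z]",         "a lowercase letter"),
    (r"[A-Z]",         "an uppercase letter"),
    (r"[0-9]",         "a digit"),
    (r"[^A-Za-z0-9]",  "a special character"),
]

def check_password(pw: str) -> list:
    """Return the descriptions of all policy rules the password violates."""
    return [desc for pattern, desc in RULES if not re.search(pattern, pw)]

print(check_password("Tr0ub4dor&3x!"))   # [] -> compliant
print(check_password("password"))        # several violations
```

When each application enforces its own variant of such rules, users accumulate incompatible requirements; centralizing the policy in an IAM solution with SSO is precisely what reduces the resulting password fatigue.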
Server-Side Rendering (SSR), Single-Page Application (SPA), and Static Site Generation (SSG) are the three most popular ways of building modern Web applications today. Understanding these approaches in depth can be helpful for both developers and clients. Developers benefit because they do not need to learn additional programming languages and can instead use their existing experience to build different kinds of Web applications; for example, a developer can use only JavaScript in all three approaches. Clients, in turn, can give their users a better experience.
The purpose of this Master's Thesis was to compare these approaches with a demo application for each and give readers a solid understanding of which approach to follow. We discuss the step-by-step process of building three applications in the above-mentioned categories. We then compare them based on criteria such as performance, security, search engine optimization, developer preference, learning curve, content and purpose of the Web, user interface, and user experience. The thesis also covers the technologies involved, such as JavaScript, React, Node.js, and Next.js, and why and where to use them. The goals we specified before development were fulfilled, as can be validated by comparing the solutions we provided for user problems, which was the application's primary purpose.
Editorial
(2022)
Sweaty has already participated several times in RoboCup soccer competitions (Adult Size). The work is now focused on stabilizing the gait. Moreover, we would like to overcome the constraints of a ZMP algorithm that requires a horizontal footplate as a precondition for simplifying the equations. In addition, we would like to switch between impedance and position control with a fuzzy-like algorithm that might help minimize jerks when Sweaty's feet touch the ground.
Decarbonisation Strategies in Energy Systems Modelling: Biochar as a Carbon Capture Technology
(2022)
The energy system has been changing for some years in order to achieve the climate goals of the Paris Agreement, which aims to prevent an increase of the global temperature above 2 °C. Decarbonisation of the energy system has become a big challenge for governments, and different strategies are being established. Germany has set greenhouse gas reduction limits for different years and tracks the progress made annually. The expansion of renewable energy systems (RES), together with decarbonisation technologies, is a key factor in accomplishing this objective.
This research analyses the effect of introducing biochar, a decarbonisation technology, and studies how it affects the energy system. Pyrolysis, the process from which biochar is obtained, is modelled in an open-source energy system model. A sensitivity analysis is made in order to assess the effect of changing the biomass potential and the costs of pyrolysis.
The role of pyrolysis is analysed in the form of different future scenarios to evaluate its impact. The CO2 emission limits for the years 2030 and 2045 are considered in creating the scenarios, as well as the integration of flexibility technologies. Four scenarios in total are assessed, and the results from the sensitivity analysis considering pyrolysis are always compared to the reference scenario, in which pyrolysis is not considered.
Results show that pyrolysis has a bigger impact on the energy system when the CO2 limit is low. Biochar can be used to compensate for the emissions of conventional power plants and achieve an energy transition at lower costs. Furthermore, it was found that pyrolysis can also reduce the need for flexibility. This study also shows that the biomass potential and the pyrolysis costs can strongly affect the behaviour of pyrolysis in the energy system.
The tenth edition of the successful report "Project Management Software Systems" provides the complete guide to a successful project management software selection program. It includes an extensive overview of the leading products on the market. If you are seeking to purchase project (portfolio) management software for your organization, this report from BARC and GPM puts the facts at your fingertips to help you select the best tool to match your requirements.
Among the many highlights of this comprehensive report, you will discover
- the critical success factors in software selection processes,
- the phases of a systematic software selection process,
- basics on software architecture regarding modern PM software, and
- descriptions of all the functions you can expect from today's PM software tools.
The second section contains a detailed analysis of market-leading products based on over 300 criteria. Each product reviewed in this report is assessed based on the same criteria so that product comparisons can be made easily.
We regularly hear of well-known online services being abused or compromised as a result of data theft. Because insecure applications jeopardize users' privacy as well as the reputation of corporations and organizations, they must be effectively secured from the outset of the development process. The limited expertise and experience of the involved parties, such as web developers, is frequently cited as a cause of insecure programs. Consequently, developers rarely have a full picture of the security-related decisions that must be made, nor do they accurately understand how these decisions affect the implementation.
Selecting the tools and procedures best suited to a given situation in order to protect an application against vulnerabilities is a critical decision. Regardless of the level of security that results from adhering to security standards, these factors inadvertently result in web applications that are insufficiently secured. JavaScript is heavily relied upon as a mainstream programming language for web applications, with several new JavaScript frameworks released every year.
JavaScript is used both on the server side in web application development and on the client side in web browsers.
However, JavaScript web programming is based on a style in which the application developer can, and frequently must, integrate various bits of third-party code. This potent combination has resulted in a situation where security issues are frequently exploited; left unchecked, such vulnerabilities can compromise an entire server. Even though there are numerous ad hoc security solutions for web browsers, client-side attacks remain popular. The issue is significantly worse on the server side, where security technologies for server-side JavaScript application frameworks are nearly non-existent.
Consequently, this thesis focuses on the server-side aspect of JavaScript: the development and evaluation of robust server-side security technologies for JavaScript web applications. There is a clear need for robust security technologies and security best practices in server-side JavaScript that allow fine-grained security.
More than ever, however, the associated risks must be reduced without hindering the web application's functionality.
This is the problem tackled in this thesis: the development of secure security practices and robust security technologies for server-side JavaScript web applications that offer adequate security guarantees without putting too many constraints on functionality.
As information technology continues to advance at a rapid pace around the world, new difficulties emerge. The growing number of organizational vulnerabilities is among the most important issues. Finding and mitigating vulnerabilities is critical in order to protect an organization's environment from multiple attack vectors.
The study investigates the complete vulnerability management process from the standpoint of the security officer role, as well as potential improvements. A few strategies are used to achieve efficient mitigation and the development of a process for tracking and mitigating vulnerabilities. A qualitative study is conducted whose objective is to create a proposed vulnerability and risk management process, and to develop a system for analyzing and tracking vulnerabilities and presenting them in a graphical dashboard format. The thesis's data was gathered through an organized literature study as well as through various web resources. We explored numerous approaches to analyzing the data, such as categorizing the vulnerabilities every 30, 60, and 90 days to see whether they were recurring or new. According to our findings, tracking vulnerabilities can be advantageous for a security officer.
We conclude that if an organization has a proper vulnerability tracking system and vulnerability management process, it can help security officers better understand and plan for reducing vulnerabilities. In terms of system patching and vulnerability remediation, it will also help the security officer identify areas of weakness in the process. The suggested approaches thus provide an alternative way to manage and track vulnerabilities effectively, although a small area still needs additional analysis and research to improve it further.
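The 30/60/90-day categorization used for tracking can be sketched as a simple age-bucketing function. The CVE identifiers and dates below are hypothetical placeholders, not findings from the study; the sketch only illustrates the tracking logic.

```python
from datetime import date

def age_bucket(first_seen: date, today: date) -> str:
    """Bucket a vulnerability by how long it has been open,
    mirroring a 30/60/90-day categorization."""
    age = (today - first_seen).days
    if age <= 30:
        return "0-30 days"
    if age <= 60:
        return "31-60 days"
    if age <= 90:
        return "61-90 days"
    return "90+ days (overdue)"

today = date(2022, 6, 1)
findings = {                      # hypothetical tracked findings
    "CVE-2022-0001": date(2022, 5, 20),
    "CVE-2022-0002": date(2022, 4, 1),
    "CVE-2021-9999": date(2021, 12, 1),
}
for cve, seen in findings.items():
    print(cve, "->", age_bucket(seen, today))
```

Grouping findings this way makes recurring and long-open vulnerabilities stand out, which is exactly the information a dashboard would surface for the security officer.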
BACKGROUND
Various neutral and alkaline peptidases are commercially available for use in protein hydrolysis under neutral to alkaline conditions. However, the hydrolysis of proteins under acidic conditions by applying fungal aspartic peptidases (FAPs) has not been investigated in depth so far. The aim of this study, thus, was to purify a FAP from the commercial enzyme preparation, ROHALASE® BXL, determine its biochemical characteristics, and investigate its application for the hydrolysis of food and animal feed proteins under acidic conditions.
RESULTS
A Trichoderma reesei derived FAP, with an apparent molecular mass of 45.8 kDa (sodium dodecyl sulfate–polyacrylamide gel electrophoresis; SDS-PAGE) was purified 13.8-fold with a yield of 37% from ROHALASE® BXL. The FAP was identified as an aspartate protease (UniProt ID: G0R8T0) by inhibition and nano-LC-ESI-MS/MS studies. The FAP showed the highest activity at 50°C and pH 4.0. Monovalent cations, organic solvents, and reducing agents were tolerated well by the FAP. The FAP underwent an apparent competitive product inhibition by soy protein hydrolysate and whey protein hydrolysate with apparent Ki-values of 1.75 and 30.2 mg*mL−1, respectively. The FAP showed promising results in food (soy protein isolate and whey protein isolate) and animal feed protein hydrolyses. For the latter, an increase in the soluble protein content of 109% was noted after 30 min.
CONCLUSION
Our results demonstrate the applicability of fungal aspartic endopeptidases in the food and animal feed industry. Efficient protein hydrolysis of industrially relevant substrates such as acidic whey or animal feed proteins could be conducted by applying fungal aspartic peptidases. © 2022 Society of Chemical Industry.
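The apparent competitive product inhibition reported for the hydrolysates corresponds, under the usual Michaelis-Menten assumptions, to the standard competitive-inhibition rate law, with the hydrolysate acting as the inhibitor I:

```latex
v = \frac{V_{\max}\,[S]}{K_m\left(1 + \frac{[I]}{K_i}\right) + [S]}
```

Increasing [I] raises the apparent K_m by the factor (1 + [I]/K_i) while leaving V_max unchanged; the reported apparent Ki values of 1.75 and 30.2 mg*mL−1 then quantify how strongly the soy and whey protein hydrolysates, respectively, inhibit the FAP.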
Significant improvements in module performance are possible via implementation of multi-wire electrodes. This is economically sound as long as the mechanical yield of the production is maintained. While flat ribbons have a relatively large contact area to exert forces onto the solar cell, wires with a round cross-section reduce this contact area considerably – in theory to an infinitesimally thin line. Therefore, the local stresses induced by the electrodes might increase to a point where mechanical production yields suffer unacceptably.
In this paper, we assess this issue with an analytical mechanical model as well as experiments with an encapsulant-free N.I.C.E. test setup. From these, we derive estimates of the relationship between lay-up accuracy and expected breakage losses. This paves the way for cost-optimized choices of handling equipment in industrial N.I.C.E. wire production lines.
Micronization of biochar (BC) may ease its application in agriculture. For example, fine biochar powders can be applied as suspensions via drip-irrigation systems or can be used to produce granulated fertilizers. However, micronization may affect important physical biochar properties such as the water holding capacity (WHC) or the porosity.
The majority of anterior cruciate ligament (ACL) injuries in team sports are non-contact injuries, with cutting maneuvers identified as high-risk tasks. Young female handball players have been shown to be at greater risk for ACL injuries than males. One risk factor for ACL injuries is the magnitude of the knee abduction moment (KAM). Cutting technique variables on foot placement, overall approach and knee kinematics have been shown to influence the KAM. Since injury risk is believed to increase with increasing task complexity, the purpose of the study was to test the effect of task complexity on technique variables that influence the KAM in female handball players during fake-and-cut tasks.
Every new technology is used by us humans almost without hesitation. Usually the military use comes first. Examples from recent history are the use of chemical weapons by Germany in the First World War and of atomic bombs in the Second World War by the US. Now, with the rapid advances in microelectronics over the past few decades, a wave of its application, called digitization, is spreading around the world with barely any control mechanisms. In many areas this has simplified and enriched our lives, but it has also encouraged abuse. The adaptation of legislation to contain the obvious excesses of “digitization” such as hate mail and anonymous threats is lagging behind massively. We hear almost nothing about technology assessment through systematic research; it is demanded at most by a few, usually small groups in civil society, which draw attention to the threats to humankind—future and present—and the Earth's ecosystem. One such group, the Federation of German Scientists (VDW) e.V., in the spirit of the responsibility of science for the peaceful and considered application of the possibilities it creates, asked three of its study groups to jointly organize its 2019 Annual Conference. The study groups “Health in Social Change,” “Education and Digitization,” and “Technology Assessment of Digitization” formulated the following position paper for the 2019 VDW Annual Conference, entitled “Ambivalences of the Digital.”
To deal with frequent power outages in developing countries, people turn to solutions like the uninterruptible power supply (UPS), which stores electric energy during normal operating hours and uses it to meet energy needs during rolling blackouts. Locally produced UPSs of poorer power quality are widely available in the marketplace, and they have a negative impact on power quality. The charging and discharging of the batteries in these UPSs generate a significant amount of power loss in weak grid environments. The Smart-UPS is our proposed smart energy metering (SEM) solution for low-voltage consumers, provided by the distribution company. It does not require batteries, so there is no power loss or harmonic distortion from charging and discharging. Through load flow and harmonic analysis of both traditional UPS and Smart-UPS systems in ETAP, this paper examines their impact on the harmonics and stability of the distribution grid. The simulation results demonstrate that the Smart-UPS can help fix power quality issues in a developing country like Pakistan by providing cleaner energy than battery-operated traditional UPSs.
Convolutional neural networks (CNN) define the state-of-the-art solution on many perceptual tasks. However, current CNN approaches largely remain vulnerable to adversarial perturbations of the input that have been crafted specifically to fool the system while being quasi-imperceptible to the human eye. In recent years, various approaches have been proposed to defend CNNs against such attacks, for example by model hardening or by adding explicit defence mechanisms, whereby a small “detector” is included in the network and trained on the binary classification task of distinguishing genuine data from data containing adversarial perturbations. In this work, we propose a simple and lightweight detector, which leverages recent findings on the relation between networks' local intrinsic dimensionality (LID) and adversarial attacks. Based on a re-interpretation of the LID measure and several simple adaptations, we surpass the state-of-the-art on adversarial detection by a significant margin and reach almost perfect results in terms of F1-score for several networks and datasets. Sources available at: https://github.com/adverML/multiLID
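The LID measure that such detectors build on can be sketched with the standard maximum-likelihood estimator over nearest-neighbour distances (Levina-Bickel style). This is a minimal numpy illustration on synthetic data, not the multiLID detector itself, which computes such features per network layer and feeds them to a classifier.

```python
import numpy as np

def lid_mle(x, reference, k=20):
    """Maximum-likelihood estimate of the local intrinsic
    dimensionality of point x w.r.t. a reference batch:
    LID(x) = -1 / mean_i log(r_i / r_k), with r_i the k
    smallest distances from x to the reference points."""
    dists = np.linalg.norm(reference - x, axis=1)
    dists = np.sort(dists)[:k]
    r_max = dists[-1]
    dists = np.maximum(dists, 1e-12)   # guard against duplicates
    return -1.0 / np.mean(np.log(dists / r_max))

rng = np.random.default_rng(0)
# Points on a 2-D plane embedded in 10-D space: the LID
# estimate should land near the intrinsic dimension 2,
# far below the ambient dimension 10.
flat = np.zeros((2000, 10))
flat[:, :2] = rng.standard_normal((2000, 2))
print(lid_mle(flat[0], flat[1:], k=50))
```

The intuition exploited for adversarial detection is that perturbed inputs tend to sit in regions of atypically high LID compared with genuine data.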
TRIZ Inventive Principles
(2022)
The analysis of several thousand patents led to the conclusion that inventive engineering problems and technical contradictions in all kinds of industrial sectors can be solved by a limited number of basic Inventive Principles (Altshuller, 1984). The modern Theory of Inventive Problem Solving TRIZ (VDI 4521) contains 40 basic Inventive Principles (IP). These principles are simple to use or modify and can easily be integrated into brainstorming or an engineer's daily work. One established part of industrial practice is the composition of specific groups of principles for solving different kinds of problems (Livotov, Petrov, 2011). Based on interdisciplinary experience of TRIZ application in industrial companies over the last 25 years, a general order in the application of the 40 Inventive Principles can be recommended for idea generation and problem solving (Livotov, Chandra, Mas'udah et al., 2019). This brochure presents an update of the 40 Inventive Principles, extending the original version (Altshuller, 1984) with 70 additional sub-principles, resulting in an advanced set of 160 sub-principles regarded as elementary inventive operators. This extended version of the inventive principles finds its application in the AIDA Automatic IDEA & IP Generator: https://www.tris-europe.com/eng/software/innovationssoftware.htm
VR-based implementation of interactive laboratory experiments in optics and photonics education
(2022)
Within the framework of a developed blended learning concept, much experience has already been gained with a mixture of theoretical lectures and hands-on activities, combined with the advantages of modern digital media. Visualizations using videos, animations, and augmented reality have proven to be effective tools for conveying learning content in a sustainable way. As a next step, ideas and concepts were developed to implement hands-on laboratory experiments in a virtual environment. The main focus is on the realization of virtual experiments and environments that give students a deep insight into selected subfields of optics and photonics.
DE\GLOBALIZE
(2022)
The artistic research cycle DE\GLOBALIZE is a media ecological search movement for the terrestrial. After examining matters of fact in India (2014-18), matters of concern in Egypt (2016-2019) and matters of care in the Upper Rhine (2018-22), the focus turns toward matters of violence in the Congo (2022). From matter to mater, mother-earth, the garden to exploitation. From science, water and climate to migration, oppression and extermination.
The long-term research is accessible through an interactive web documentation. The platform serves as a continuous media-archaeological archive for a speculative ethnography. The relational structure of the videographic essay enables the forensic processing of single documents in the sense of actor-network theory.
The subject of the presentation at IFM is a field trip to the Congo planned for March 2022, which will focus on the ambivalence of violence and care in collaboration with local artists. The field trip is based on the postcolonial reflection luderitzcargo by the author from 1996, in which a freight container was transformed into a translocal cinema in Namibia.
Through the journey to the Congo, a group of media artists, a psychotherapist, a theater dramaturg, a filmmaker and a philosopher intend to explore political, technological and psycho-geographic borders. Through artistic interventions with locals, we want to interfere with relational string figures as part of the new Earth Politics. These interventions focus on the displaced consumption of resources that are hard-fought over and guarantee prosperity in the global north. The so-called ghost acreages are repressed and justified as part of a civilizational mission. With this trip, we want to confront our self-lies with those of our hosts. We want to confront ourselves with the foreign, the dark and the displaced ghosts within ourselves. In the presentation at the #IFM2022 Conference, the platform DE\GLOBALIZE will itself be problematized as an example of epistemic violence for the ethnographic memory of (Western) knowledge.
We are not missionaries but perplexed travellers. In our search movement, we engage with psychoanalysis, video, performance and trance. As disoriented white men, we attempt the reversal of Black Skin, White Masks by Frantz Fanon without blackfacing. We will care not only about the sensitivity of our own skin but also about that of our g/hosts and of mother earth.
A circuit arrangement of a motor vehicle includes a high-voltage battery for storing electrical energy, an electric machine for driving the motor vehicle, a converter via which high-voltage direct current voltage provided by the high-voltage battery is convertible into high-voltage alternating current voltage for operating the electric machine, and a charging connection for providing electrical energy for charging the high-voltage battery. The converter is a three-stage converter having a first switch unit which is assigned to a first phase of the electric machine. The first switch unit has two switch groups connected in series which each have two insulated-gate bipolar transistors (IGBTs) connected in series, where a connection is disposed between the IGBTs of one of the two switch groups, which connection is electrically connected directly to a line of the charging connection.
In the past, running shoes were categorized as motion control, cushioned, or minimal footwear. Today, these categories blur and are no longer clearly defined. Moreover, with advances in manufacturing processes, it is possible to create individualized running shoes that incorporate features meeting individual biomechanical and experiential needs. However, specific ways to individualize footwear to reduce individual injury risk are poorly understood. Therefore, the purpose of this scoping review was to provide an overview of (1) footwear design features that have the potential for individualization; (2) human biomechanical variability as a theoretical foundation for individualization; (3) the literature on differential responses to footwear design features between selected groups of individuals. These purposes focus exclusively on reducing running-related risk factors for overuse injuries. We included English-language studies on adults that analyzed: (1) potential interaction effects between footwear design features and subgroups of runners or covariates (e.g., age, gender) for running-related biomechanical risk factors or injury incidences; (2) footwear perception for a systematically modified footwear design feature. Most of the included articles (n = 107) analyzed male runners. Several footwear design features (e.g., midsole characteristics, upper, outsole profile) show potential for individualization. However, the overall body of literature addressing individualized footwear solutions and their potential to reduce biomechanical risk factors is limited. Future studies should leverage more extensive data collections considering relevant covariates and subgroups while systematically modifying isolated footwear design features to inform footwear individualization.
Featherweight Generic Go (FGG) is a minimal core calculus modeling the essential features of the programming language Go. It includes support for overloaded methods, interface types, structural subtyping and generics. The most straightforward semantic description of the dynamic behavior of FGG programs is to resolve method calls based on runtime type information of the receiver.
This article shows a different approach by defining a type-directed translation from FGG to an untyped lambda-calculus. The translation of an FGG program provides evidence for the availability of methods as additional dictionary parameters, similar to the dictionary-passing approach known from Haskell type classes. Then, method calls can be resolved by a simple lookup of the method definition in the dictionary.
Every program in the image of the translation has the same dynamic semantics as its source FGG program. The proof of this result is based on a syntactic, step-indexed logical relation. The step-index ensures a well-founded definition of the relation in the presence of recursive interface types and recursive methods.
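The dictionary-passing idea behind the translation can be illustrated outside of Go. The following minimal Python sketch (all names hypothetical; Python stands in here for the untyped lambda-calculus target) represents an interface value as a payload paired with a dictionary of its methods, so that a method call becomes a plain dictionary lookup rather than a dispatch on runtime type information:

```python
# Dictionary-passing sketch: an "interface value" is a pair (payload, dict),
# where the dict maps method names to implementations. A method call is then
# a lookup in the dict, not a dispatch on the receiver's runtime type.

# Two hypothetical "struct" types implementing a Stringer-like interface.
def make_point(x, y):
    payload = {"x": x, "y": y}
    methods = {"String": lambda self: f"({self['x']}, {self['y']})"}
    return payload, methods

def make_label(text):
    payload = {"text": text}
    methods = {"String": lambda self: self["text"]}
    return payload, methods

def call(value, method):
    """Resolve a method call by dictionary lookup."""
    payload, methods = value
    return methods[method](payload)

print(call(make_point(1, 2), "String"))   # (1, 2)
print(call(make_label("hello"), "String"))  # hello
```

In the article's actual translation the dictionaries are passed as additional parameters (as in Haskell's type-class dictionaries); this sketch only conveys the lookup-based resolution, not the typed translation itself.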
The sharp rise in electricity and oil prices due to the war in Ukraine has caused fluctuations in the results of the previous study on the economic analysis of electric buses. This paper shows how the increase in fuel prices affects the implementation of electric buses. It constructs a Total Cost of Ownership (TCO) model for the transition to electric buses in the small to mid-size city of Offenburg. The future development of costs is estimated, and a projection based on learning curves is carried out. This study intends to introduce a new future prospect by presenting the latest data based on previous research. Through the new TCO result, the cost differences between the existing diesel bus and the electric bus are updated, and the future prospects for the economic feasibility of the electric bus in a small to mid-size city are presented.
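At its core, a TCO comparison of this kind sums the purchase price and the discounted yearly operating costs over the vehicle lifetime. The following sketch illustrates that structure only; every figure is invented for illustration and is not taken from the study:

```python
def total_cost_of_ownership(purchase_price, annual_energy_cost,
                            annual_maintenance, lifetime_years,
                            discount_rate=0.03):
    """Purchase price plus discounted yearly operating costs (EUR)."""
    tco = purchase_price
    for year in range(1, lifetime_years + 1):
        yearly = annual_energy_cost + annual_maintenance
        tco += yearly / (1 + discount_rate) ** year  # discount to present value
    return tco

# Hypothetical example figures, not from the study: a diesel bus with a
# lower purchase price but higher energy cost vs. an electric bus.
diesel = total_cost_of_ownership(250_000, 40_000, 15_000, 12)
electric = total_cost_of_ownership(450_000, 15_000, 10_000, 12)
print(f"diesel: {diesel:,.0f} EUR, electric: {electric:,.0f} EUR")
```

With these made-up numbers the higher acquisition cost of the electric bus is offset by lower operating costs over the lifetime; the study's actual result depends on its real cost data and learning-curve projections.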
The importance of machine learning (ML) has been increasing dramatically for years. From assistance systems to production optimisation to support for the health sector, almost every area of daily life and industry comes into contact with machine learning. Besides all the benefits that ML brings, its lack of transparency and the difficulty of establishing traceability pose major risks. While there are solutions that make the training of machine learning models more transparent, traceability is still a major challenge, as is ensuring the identity of a model: unnoticed modification of a model is a further danger when using ML. One solution is to create an ML birth certificate and an ML family tree secured by blockchain technology. Important information about training and about changes to the model through retraining can be stored in a blockchain and accessed by any user, providing more security and traceability for an ML model.
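One way to read the proposal: each training or retraining event appends a record whose hash chains to the previous entry, so any unnoticed modification of the history breaks verification. A minimal standard-library sketch (the record fields are hypothetical, and a real deployment would use an actual blockchain rather than an in-memory list):

```python
import hashlib
import json

def record_hash(record, prev_hash):
    """Hash a training record together with the previous entry's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append_event(chain, record):
    """Append a birth/retraining event, chaining it to the last entry."""
    prev_hash = chain[-1]["hash"] if chain else "genesis"
    chain.append({"record": record, "hash": record_hash(record, prev_hash)})

def verify(chain):
    """Recompute every hash; any tampered record invalidates the chain."""
    prev_hash = "genesis"
    for entry in chain:
        if record_hash(entry["record"], prev_hash) != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True

chain = []
append_event(chain, {"event": "birth", "model": "demo", "data_hash": "abc"})
append_event(chain, {"event": "retraining", "epochs": 5})
print(verify(chain))   # True
chain[0]["record"]["data_hash"] = "tampered"
print(verify(chain))   # False
```

The "family tree" aspect could be added by letting a record reference the hash of a parent model's birth certificate.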
Narrowband Internet-of-Things (NB-IoT) is a 3rd Generation Partnership Project (3GPP) standardized cellular technology, adopted for 5G and optimized for massive Machine Type Communication (mMTC). Applications are anticipated around infrastructure monitoring, asset management, smart city and smart energy use cases. In this paper, we evaluate the suitability of NB-IoT for private (campus) networks in industrial environments, including complex cloud-based applications around process automation. An end-to-end system has been developed, comprising a sensor unit connected to an NB-IoT modem, a base station (gNodeB) equipped with a beamforming array, and a local (private) network architecture with a sensor management system in the edge cloud. The experimental study includes field tests in realistic industrial environments with latency, reliability and coverage measurements. The results show a good suitability of NB-IoT for process automation applications with high scalability, low-power requirements and moderate latency requirements.
The identification of vulnerabilities is an important element in the software development life cycle to ensure the security of software. While vulnerability identification based on source code is a well-studied field, the identification of vulnerabilities on the basis of a binary executable without the corresponding source code is more challenging. Recent research has shown how such detection can be achieved by deep learning methods. However, that particular approach is limited to the identification of only 4 types of vulnerabilities. We therefore analyze to what extent the identification of a larger variety of vulnerabilities can be covered. To this end, a supervised deep learning approach using recurrent neural networks for vulnerability detection based on binary executables is used. The underlying basis is a dataset with 50,651 samples of vulnerable code in the form of a standardized LLVM Intermediate Representation. The vectorised features of a Word2Vec model are used to train different variations of three basic architectures of recurrent neural networks (GRU, LSTM, SRNN). A binary classification model was established for detecting the presence of an arbitrary vulnerability, and a multi-class model was trained for the identification of the exact vulnerability; these achieved out-of-sample accuracies of 88% and 77%, respectively. Differences in the detection of different vulnerabilities were also observed, with non-vulnerable samples being detected with a particularly high precision of over 98%. Thus, the methodology presented allows an accurate detection of 23 (compared to 4) vulnerabilities.
Due to the Covid-19 pandemic, the RoboCup World Cup 2021 was held completely remotely. For this competition the Webots simulator (https://cyberbotics.com/) was used, so all teams needed to transfer their robots to the simulation. This paper describes our experiences during this process as well as a genetic learning approach to improve our walk engine, allowing more stable and faster movement in the simulation. To scale the training easily, we used a Docker setup. The resulting movement was one of the outstanding features that finally led to the championship title.
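The genetic approach to tuning walk-engine parameters can be sketched generically. In the sketch below the parameter vector and the fitness function are hypothetical stand-ins; in the real setup, fitness would be measured from walking speed and stability in Webots simulation rollouts:

```python
import random

random.seed(42)  # reproducible run

def fitness(params):
    """Hypothetical stand-in: reward parameters close to a known optimum.
    A real fitness would come from a simulator rollout."""
    target = [0.3, 1.2, 0.05]
    return -sum((p - t) ** 2 for p, t in zip(params, target))

def mutate(params, sigma=0.05):
    """Gaussian perturbation of every parameter."""
    return [p + random.gauss(0, sigma) for p in params]

def evolve(pop_size=20, generations=50, n_params=3):
    population = [[random.uniform(0, 2) for _ in range(n_params)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 4]            # truncation selection
        population = parents + [mutate(random.choice(parents))
                                for _ in range(pop_size - len(parents))]
    return max(population, key=fitness)

best = evolve()
print(best)
```

Evaluating each candidate in its own simulator instance is what makes a Docker setup attractive: fitness evaluations are independent and parallelize across containers.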
This review provides an overview of the production and analysis techniques for antioxidative peptides from food proteins. Regarding the production of antioxidative peptides, interlinked factors must be considered. Depending on the protein substrate, different peptidases or peptidase systems containing multiple enzymes, as well as a specific production process, must be chosen. The antioxidative peptides might be produced in a batch process including multiple pre- and post-treatments besides the hydrolysis with peptidases itself. As an alternative, the potential of continuous production systems is discussed in this review. Furthermore, robust analysis tools are needed to gain control of the process and the final product properties. With no standardized methodology available for antioxidative peptide evaluation, the pros and cons of various strategies for peptide separation and antioxidative measurement are discussed. Therefore, this review provides a roadmap for antioxidative peptide generation from various sources for research and development as well as for potential industrial use.
Gas Analysis and Optimization of Debinding and Sintering Processes for Metallic Binder-Based AM*
(2022)
Binder-based additive manufacturing processes for metallic AM components in a wide range of applications usually use organic binders and process-related additives that must be thermally removed before sintering. Debinding processes are typically parameterized empirically and thus far from the optimum. Since debinding based on thermal decomposition processes of organic components and the subsequent thermochemical reactions between process atmosphere and metal powder materials make uncomplicated parameterization difficult, in-situ instrumentation was introduced at Fraunhofer IFAM. This measurement method relies on infrared spectroscopy and mass spectrometry in various furnace concepts to understand the gas processes of decomposition of organic components and the subsequent thermochemical reactions between the carrier gas atmosphere and the metal part, as well as their kinetics. This method enables an efficient optimization of the temperature-time profiles and the required atmosphere composition to realize dense AM components with low contamination. In the paper, the optimization strategy is presented, and the achievable properties are illustrated using a fused filament fabrication (FFF) component example made of 316L stainless steel.
Synthesizing voice with the help of machine learning techniques has made rapid progress over the last years. Given the current increase in the use of conferencing tools for online teaching, we question just how easy (in terms of needed data, hardware, and skill set) it would be to create a convincing voice fake. We analyse how much training data a participant (e.g. a student) would actually need to fake another participant's voice (e.g. a professor's). We provide an analysis of the existing state of the art in creating voice deep fakes and apply the identified as well as our own optimization techniques in the context of two different voice data sets. A user study with more than 100 participants shows how difficult it is to distinguish real from fake voices (on average, only 37% can recognize a professor's fake voice). From a longer-term societal perspective, such voice deep fakes may lead to disbelief by default.
In the literature, many studies have described the 3D printing of ceramic-based scaffolds (e.g., printing with calcium phosphate cement) in the form of linear structures with layer rotations of 90°, although no right angles can be found in the human body. Therefore, this work focuses on the adaptation of biological shapes, including a layer rotation of only 1°. Sample shapes were printed with calcium phosphate cement using a 3D Bioplotter from EnvisionTec. Both straight and wavy spokes were printed in a round structure with 12 layers. Depending on the strand diameter (200 and 250 µm needle inner diameter) and strand arrangement, maximum failure loads of 444.86 ± 169.39 N for samples without subsequent setting in PBS up to 1280.88 ± 538.66 N after setting in PBS could be achieved.