Harmonic Analysis and Figured Bass with Finale 2011

I recently started studying music theory at a community college.  As a classically-trained musician, the class fills in so many pieces.  For years I would play classical pieces and sort of feel what was going on.  Based on intuition, common sense, and deduction I sort of knew what was happening behind the scenes, or at least knew that certain things were going on back there.  Now I’m actually studying those things in detail and it’s very rewarding.

Almost all of our work is done by pen (actually pencil) and paper, but I’ve been interested in using what is considered one of the top music notation applications, Finale, to work with scores.  I suppose “daunting” begins to describe using an application like it.  But actually I’ve found that just knowing how to do several key things makes a big difference.

One major aspect of what we’re doing in class is analyzing music: determining what chords a work comprises.  Since chords can be inverted, we also need to note which inversion each chord is in.

Many of our exercises require us to mark the symbols for chords beneath notes along with figured bass symbols.

I was able to obtain a copy of Finale 2011; however, as far as I could tell, it lacks the ability to create these notations.  On the Finale website the makers of the program tout that the new version, Finale 2012, includes a new font, Finale Numerics, which is designed for exactly this kind of notation.

When I came across that page I basically began salivating.  The two links off that page, for harmonic analysis and figured bass, cover exactly what we’re doing in class!

I then began researching this.  Someone in a forum suggested finding a way to get a copy of the Finale Numerics font and simply copying it onto a system running Finale 2011.  Easier said than done, until I found a trick:

Anyone can download and install Finale Notepad, which is basically a free viewer for scores created with Finale.  The trick is to install it and then (under Windows) go to Settings -> Control Panel -> Fonts and look for “Finale Numerics Regular” in the list of fonts.  Right-click on it and select “Copy”.

Now, on any system you have Finale 2011 installed on, copy that file, right-click on it and select “Install”.

Voila!  You now have the Finale Numerics font available for harmonic analysis and figured bass notation in Finale 2011.

For convenience’s sake, here is the actual font.  (The file has been zipped for security reasons.  You will need to unzip it with 7-Zip before installing it.)

Very useful astronomical calendar

Mozilla Lightning is a calendar plugin for the Mozilla Thunderbird e-mail client.  With Lightning it is easy to add network-based calendars.  This calendar has important astronomical dates and is very useful: http://cantonbecker.com/astronomy-calendar/

I’ve seen other calendars for astronomical dates, but they had problems.  One main problem is that they don’t have the exact times for major astronomical events, such as solstices and equinoxes.  Even worse, with some the actual date of an event can be wrong: if the time of the event does not get converted from Coordinated Universal Time (UTC) to your local timezone, it can show up on the wrong day.
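To illustrate, here is a quick sketch using GNU date (the timestamp is the commonly published UTC time of the March 2012 equinox; the timezone is just an example):

# The March 2012 equinox occurred on March 20 at 05:14 UTC.
# Converted to US Pacific time, the same instant falls on March 19:
TZ="America/Los_Angeles" date -d "2012-03-20 05:14 UTC"
# prints: Mon Mar 19 22:14:00 PDT 2012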

But the calendar listed above works perfectly: it lists equinox and solstice events with correctly converted local times and dates, as well as Mercury retrograde dates and solar eclipses.

Shopping bag resource consumption comparison


The BBC has an article about plastic bags.  Reusing the type of plastic bag now banned in many places is in fact far less resource-consumptive than using alternative types of bags like paper or cotton.

But the bags are getting banned because apparently most people don’t reuse them, and large numbers of them end up as litter.  It’s inconceivable to me how anyone could litter a plastic bag.  Maybe the real problem is too many humans?

Hexagonally-based patterns of phosphorylated tubulins in microtubules

This is one of those articles about a research breakthrough that for the most part just goes by unnoticed, yet its implications are profound:

Scientists Claim Brain Memory Code Cracked

In an article in the March 8 issue of the journal PLoS Computational Biology, physicists Travis Craddock and Jack Tuszynski of the University of Alberta, and anesthesiologist Stuart Hameroff of the University of Arizona demonstrate a plausible mechanism for encoding synaptic memory in microtubules, major components of the structural cytoskeleton within neurons.

Interesting fact about dental amalgam fillings

Recently I had an amalgam filling done for one of my molars.  This particular tooth has had many issues in the past with composite fillings.  Because of the location and nature of the filling, the composite fillings never really held up very well.  It seemed like I was getting the filling redone almost every year.

The most recent time I was at the dentist I decided it was enough.  Many times in the past she had actually advised me to use an amalgam filling instead, but because of a lot of negative things I had heard and read about amalgam fillings, I ruled it out.

But after going through problem after problem with the tooth and enduring almost continual discomfort, I was getting tired of it.  The day I was at the dentist’s office she had drilled my tooth and it was prepared for yet another composite filling.  She was of course not particularly happy to have to be redoing the filling yet again, and neither, really, was I.

I was lying back in the chair and she was just about to etch the tooth in preparation for the composite filling when I put my hand up in the air and muttered as best I could that I wanted an amalgam filling.  She almost seemed stunned and repeated my request back to me as a question to make sure she understood.  I answered “Uhh huh” with my mouth pried open, and she understood that that was what I wanted.

After doing my own reading on the topic, it turns out that for the type of filling I was having done, amalgam is in fact by far the best choice.  After having it done I actually felt kind of bad for having put my dentist through all the redos of that filling over the years.

I just read today that the type of mercury in amalgam fillings is elemental mercury – similar to quicksilver.  According to Wikipedia:

Quicksilver (liquid metallic mercury) is poorly absorbed by ingestion and skin contact. It is hazardous due to its potential to release mercury vapor. Animal data indicate that less than 0.01% of ingested mercury is absorbed through the intact gastrointestinal tract; though it may not be true for individuals suffering from ileus. Cases of systemic toxicity from accidental swallowing are rare, and attempted suicide via intravenous injection does not appear to result in systemic toxicity.

So it turns out that the type of mercury in amalgam fillings is in fact very different from the kind that everyone worries about in seafood, and far less toxic to the body.  Even when ingested it is essentially passed through the body.

I thought this was interesting information because there may be situations similar to mine where an amalgam filling really is much more suitable – namely, on chewing surfaces of teeth, where cosmetic considerations are minimal and the long-term stability of the filling, its cohesion to the tooth, and its resistance to decay are of utmost importance.

I will note one further thing: even composite fillings are toxic.  They simply use different chemicals.  In fact no dental filling is non-toxic.  It’s a risk/benefit thing.  In my case the actual chemicals used to treat the tooth for the composite filling caused significant irritation to the nerve of the tooth.  This, along with the fact that the seal is not as tight as that of an amalgam filling, led to more-or-less ongoing discomfort with the tooth in question.  There is no question in my case that amalgam was the right choice.

There is a lot of negative information out there about amalgam fillings.  I’ve even met people who had all their dental work redone – all their fillings removed and redone with composite – because they believed it was toxic.  I’m not going to state anything definitive about another’s health choices or beliefs, but I personally seriously question such behavior in light of what is known about mercury and the human body.

Isotopic Analysis of Ancient Rock Tells Deep Story of Early Earth

Geochemistry is, I believe, one of the most under-appreciated areas of scientific research. With modern techniques available for measuring extremely tiny quantities of rock and mineral material to extremely high precision, details are emerging about how Planet Earth formed that were never known before.

I am just reading about a new study which examined 2.8-billion-year-old volcanic rocks from Russia called komatiites. The isotopic signature of the element tungsten in this rock is distinct from that of other Earth rocks, and the researchers believe the rock comes from the very earliest period of Earth’s formation, possibly within only 10-20 million years of the formation of the Solar System itself.

It had previously been believed, the article explains, that the earliest rocks of Earth would have eventually mixed with other rocks which Earth acquired through ongoing impacts with ever larger bodies, as its mass increased substantially from the earliest times. But this latest finding implies that a portion of the earliest rock did not get mixed in with later-acquired rock, but remained distinct.

From the article:

“This difference in isotopic composition requires that the early Earth formed and separated into its current metallic core, silicate mantle, and perhaps crust, well within the first 60 million years after the beginning of our 4.57-billion-year-old Solar System,” says Touboul.

“In itself this is not new,” he says, “but what is new and surprising is that a portion of the growing Earth developed the unusual chemical characteristics that could lead to the enrichment in 182-tungsten; that this portion survived the cataclysmic impact that created our moon; and that it remained distinct from the rest of the mantle until internal heat melted the mantle and transported some of this material to the surface 2.8 billion years ago, allowing us to sample it today.”


Source: Building Blocks of Early Earth Survived Collision that Created Moon

If I were president I would have guys like this come to places like under the dome at San Francisco Center and give lectures on this stuff.  How can people not be fascinated by it?  I would also have people from diverse fields like geochemistry, cosmology, astrophysics, paleobotany, evolutionary biology, etc. come together for symposiums and produce a book series characterizing the Universe from the Big Bang, through the development of Earth and the evolution of life, to the stages of human history leading to the present.  It could even start out (or end, which might be more appropriate) with speculation on cyclic cosmology and what came before the Big Bang.  I would then also make a video series based on the books.

In fact I might make one or several episodes just devoted to geochemistry and the people who do isotopic analysis, have extensive interviews with them, survey the equipment and techniques that get used for analysis, and get into detail about how and where they locate samples.

A lost world 8,000 years ago

Today, the 11-month anniversary of the Japan 3/11 tsunami, I was reading about historical tsunamis, which are a fascinating subject.  This led me to learn about the enormous landslides called the Storegga Slides, which occurred around 8,000 years ago in the area of the North Sea and resulted in the submerging of a huge landmass that once connected Great Britain with continental Europe.

Watching a documentary video of it now.
(hint: Jump to late ep. 2/7 to see a hot Danish archaeologist with a great tan and no shirt.  Jump to ep. 3/7 to see that Mesolithic peoples were avid seafood gatherers.  Ep. 6/7 gives speculation on the transition from Mesolithic seafood gathering to Neolithic agrarian society.)

How many articles?

All the time I read articles about how being healthy is healthy.  As if a person doesn’t know what is healthy.  Not being healthy gives one something to do.  Instead of just being healthy one has to educate oneself, learn what is healthy, and put it into practice.  There’s so much more to do, it seems, when we are not healthy.  I guess being healthy is boring.  No?

Well I sincerely hope not, but that is the spin I get from it.  I read day in and day out about how one practice or another, one healthy thing – eating a certain food, engaging in a certain practice, etc. – has benefits.

What are we?  Does it not stand to reason that, as a human being – as a living organism which lives in a world, has a body, has a mind, can distinguish between things, and has an inherent interest in maintaining its well-being – we already know what the most healthy things are?

If that were not the case it would be very strange.  If we were some type of being which had all the attributes mentioned above and were not inherently capable of knowing what is healthy, what is good for us, it would be an anomaly.

In this particular instance I’m reading an article about how diet affects brain health – how eating a good diet essentially protects our brains, while eating a poor diet harms them.

How many articles must one read before one simply understands that being healthy is healthy?  How much affirmation does one need?

I want to stop reading these articles.  What benefit is there from reading them?  I think that I should already know how to choose in my life and have cultivated the skill and discipline to make the right choices, at least for very essential things like the food I put into my mouth every day.  I should believe in myself.  Not believing in myself is not healthy.


Re: Getting away from Free Web Services

I wanted to revisit what I discussed in a recent post, “I highly recommend getting away from all these ‘services’ like Google and Yahoo”.

I still recommend trying to get several things of your own:

1.  Domain name

2.  E-mail account

3.  Web server

Getting a domain name costs about $15 per year.  Owning your own domain, like xyz.com, is nice, but it’s not actually useful until you host the domain somewhere.  Fortunately there are a lot of great hosting providers out there with great plans.  I used DreamHost for a long time and recently switched to HostGator for better performance.  Both are good and there are many others.

More than likely, if you go with any hosting provider you are going to want what is called a shared hosting plan.  Basically, with a plan like this you are allocated an account on a server and have the use of a web server, a database server, and a lot of cool tools such as analysis and statistics for your web sites.  I said web sites – plural – because, yes, if you have your own domain you can in fact have as many sites as you want.  You can have love.xyz.com, wonderful.xyz.com, etc.  Those can all be separate sites if you want.  Or you can just have one main site like xyz.com.

It may seem overwhelming that there are things like databases involved.  In fact, if you use the automatic installation tools that hosting providers give customers via a web interface to create web sites, you won’t need to worry about this too much.  If you set up WordPress, for example (which is what this site runs), you can do what is called a “one-click install” and most of the backend setup will be done automatically.
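For the curious, the manual route isn’t much harder.  Here is a rough sketch of a by-hand WordPress install (the ~/public_html document root and the database credentials are placeholders; your hosting provider’s layout may differ):

# download and unpack the latest WordPress release
wget https://wordpress.org/latest.tar.gz
tar xzf latest.tar.gz
cp -r wordpress/* ~/public_html/
# create wp-config.php from the sample, then fill in the database
# name, user, and password you created in your hosting control panel
cp ~/public_html/wp-config-sample.php ~/public_html/wp-config.php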

In fact I recommend WordPress because it is highly configurable.  There are hundreds if not thousands of plugins for it which can add features and functionality.  You can create a YouTube-like site with it, or a more basic blog like this one.  You can add social-networking features to interconnect with Facebook or other protocols.

I recommend this because it gets you away from relying on the big companies for all your Internet activity.  That is the main point.  It is very unsettling that people are flocking to these huge web service companies like Google and Yahoo for all their critical Internet activities such as posting information, networking, and communications.

The original creators of the Internet were visionaries who, fortunately for our sakes, were not thinking of cashing in and making insane profits by exploiting what can be considered functional communication and information-exchange niches.  The Internet, being an extremely new technology, has been full of vast potential in terms of how people communicate and process information.  The original Internet founders began establishing open protocols to enable systems around the world to exchange information in as convenient and open a way as possible.

They did not try to make e-mail proprietary and copyright all the technology behind it in order to make insane profits.  Instead they took some of the brightest minds in information technology from places like leading academic institutions and created open protocols and standards which determine how all e-mail communications should take place.

What this did was enable anyone, anywhere to create an e-mail application which used these open protocols to communicate with any other e-mail application that anyone else created, and be able to send, process, and receive e-mails.  Without standards like this the backbone of the Internet, which we often take for granted, would not exist.

In order for information to be transmitted from one computer somewhere in the world to your computer, a whole lot of things need to happen.  A huge portion of that happens because of the protocols and standards laid down by the first generation of Internet pioneers.

The thinking in today’s world is unfortunately quite different than it was in the beginning.  Nowadays companies are formed with the exclusive intent of finding some novel area of information exchange, communication, or web functionality that no one else has developed yet, and trying to capitalize on and corner it as a market – to “own” as much of it as possible and prevent anyone else from being able to do so later.

If you really think about it, the new form of thinking is actually quite sickening.  I’ve been able to work in companies with both the old form of thinking and the new, and I can say quite decidedly that the work environments in the companies with the “new” form of thinking are pretty disgusting.  You see a lot of young punks who seem to have little true love for information technology as a science and who just want to get rich quick and cash in when their startup goes IPO.

This is a far cry from companies where true technologists who love information science work because they are inspired and have a greater vision of technology.

Every time people cave in and use a lot of these “free” Internet services provided by all these startup companies, they are in a sense destroying the world of the greater vision of information technology in favor of a craven, shallow, and rather greedy view that is based on making profit.

I believe it’s really important to get away from companies like Facebook.  While the company may be valued on the stock exchange as being worth billions of dollars, to me it is actually worth far less than nothing.  A company like Facebook is damaging and harmful to the world of information technology.  It capitalizes on catering to people’s laziness (and often their ignorance).  It’s like spoonfeeding the masses, who become addicted to being spoonfed, which creates a whole market of companies clamoring over each other for a piece of the spoonfeed-pie profit.  But this is not true profit in the original sense of the word, of progress or advancement.  It is a short-sighted ripoff profit which sells out the future for the sake of the present.


OK, getting back to the original discussion, I want to mention one thing: if you get your own domain name you don’t necessarily have to purchase your own hosting plan with a hosting provider.  Most plans allow multiple if not unlimited domains.  Therefore you can split the cost of a plan with several people, resulting in an annual cost that is trivial.

Once you’ve got a domain set up with a hosting provider you can start setting up your e-mail, web sites, etc.  From that point on it should be easy (and fun).

I mentioned WordPress but there are many other types of web applications that are available.  The options are almost unlimited.  And you can of course create your own web sites entirely from scratch if you are up to the challenge.

You can also use a program like Microsoft Expression Web to create a web site based on a template or from scratch.  The possibilities are endless.

We live in a time when the privacy implications of many things are quite troubling.  For example, the simple task of having contacts on a mobile phone backed up and synchronized with a desktop mail/contacts program often involves having to use a service like Google Mail.  But do you really want to give a huge company access to all your e-mails and contacts?  Even if they claim they only use the information for marketing, that’s still quite distasteful.  In my opinion, even though these gigillion-dollar Internet behemoths have come into existence, their models are not the best ones to follow.  There are much better ones.  The absolute best is the one based on total freedom, open standards, and Open Source software.  It does not require advertising revenue; frankly, I’d rather pay $30 per year for a domain and shared hosting and have total freedom.

If everyone followed this model we could actually have an Internet that is free of all the hyper-marketing and advertising.  This model needs to extend into the hardware realm.  We all see the disgusting wars going on between the huge technology device behemoths as they sue the crap out of each other and attempt to impede and obstruct each other, while claiming more and more patents which are frivolous and destructive.

We need to switch to a model, as in the early days of the Internet, that is guided by inspiration and passion, not greed and gluttony.

Connecting to a Juniper VPN via the command-line in Debian

Intro:

You want to connect your Linux box running Debian to a Juniper VPN using the command line only. (You don’t want to run X on your machine, nor necessarily install Java to run the graphical client.) It’s possible to do this with the ncsvc command-line app provided by Juniper.

Strangely, the command-line app is contained in a .jar file on the VPN server. But you can download this and extract it.

You will also need to download the certificate from the VPN server and process it to .der format.

Note that there are other ways of doing this. I like the idea of using a SecurID token as a credential. The process outlined below, though, is only for using a username/password.

Download the client:

If your Juniper VPN host is e.g. sslvpn.abc.com, from any web browser go to:

sslvpn.abc.com/dana-cached/nc/ncLinuxApp.jar

Download ncLinuxApp.jar and upload the file to your home directory on the Debian machine.
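Alternatively, if the Debian machine can reach the VPN host directly, a single wget should work (a sketch; --no-check-certificate may be needed if your system doesn’t yet trust the server’s certificate):

wget --no-check-certificate https://sslvpn.abc.com/dana-cached/nc/ncLinuxApp.jar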

Install the client:

On your Debian machine:

mkdir -p ~/.juniper_networks/network_connect/
unzip ncLinuxApp.jar -d ~/.juniper_networks/network_connect/
# ncsvc must be setuid root (mode 6711) so it can create the tunnel and modify routes
sudo chown root:root ~/.juniper_networks/network_connect/ncsvc
sudo chmod 6711 ~/.juniper_networks/network_connect/ncsvc
# make the diagnostic tool executable
chmod 744 ~/.juniper_networks/network_connect/ncdiag

The client is a 32-bit app. If you’re running 64-bit Linux then you need to install the 32-bit compatibility libraries:

sudo apt-get install libc6-i386 lib32z1 lib32nss-mdns

Set up the certificate:

I had some problems with the next step:

sh getx509certificate.sh sslvpn.abc.com network_connect/sslvpn.abc.com.der

The exact error was:

Connecting to sslvpn.abc.com port 443
Generating Certificategetx509certificate.sh: 18: let: not found
getx509certificate.sh: 19: let: not found
error

But it did create the two files out.txt and out1.txt, which is all that’s needed.  Here’s what I did:

cp out.txt cert.txt

Edit cert.txt and remove everything except the certificate and the “BEGIN CERTIFICATE” and “END CERTIFICATE” lines.  In other words, your cert.txt file should look exactly like this:

-----BEGIN CERTIFICATE-----
[a lot of random characters]
-----END CERTIFICATE-----
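If you’d rather not edit by hand, a sed one-liner can do the extraction (a sketch, assuming the certificate is the only BEGIN/END CERTIFICATE block in out.txt):

sed -n '/-----BEGIN CERTIFICATE-----/,/-----END CERTIFICATE-----/p' out.txt > cert.txt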

Then:

openssl x509 -in cert.txt -outform der -out network_connect/sslvpn.abc.com.der

Connecting:

./ncsvc -h sslvpn.abc.com -u username -f sslvpn.abc.com.der

The screen should say:

Connecting to sslvpn.abc.com : 443

If you run the command route -n you should see the routes set up by the Juniper VPN.

To Do:

Copy the ncsvc binary to /usr/local/bin, create a proper /etc directory for juniper_networks and store the key there, and copy /etc/init.d/skel to create your own custom startup script.

I actually did this and ran into one hitch: the ncsvc command does not return to the prompt when it is invoked.  To get around this I had to create /usr/local/bin/ncsvc.wrapper, which runs the following:

#!/bin/sh
ncsvc "$@" &

I then call the wrapper from inside the init.d script in place of ncsvc.
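For reference, here is a minimal sketch of what that init.d script might look like.  The paths, host, and username come from the examples above or are placeholders, and you should verify the -K flag (which kills a running ncsvc session) against your version of the client:

#!/bin/sh
# /etc/init.d/juniper-vpn -- minimal sketch based on /etc/init.d/skel
WRAPPER=/usr/local/bin/ncsvc.wrapper
CERT=/etc/juniper_networks/sslvpn.abc.com.der

case "$1" in
  start)
    # the wrapper backgrounds ncsvc so the script returns to the prompt
    $WRAPPER -h sslvpn.abc.com -u username -f $CERT
    ;;
  stop)
    # -K tells ncsvc to kill the running session
    /usr/local/bin/ncsvc -K
    ;;
  *)
    echo "Usage: $0 {start|stop}" >&2
    exit 1
    ;;
esac
exit 0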

Credit:

Pretty much all of this was taken from the first two of the sites below:

http://www.rz.uni-karlsruhe.de/~iwr91/juniper/
http://www.entropy.ch/blog/Mac+OS+X/2007/07/28/Juniper-Network-Connect-SSL-VPN-and-Virtualization.html

http://kb.juniper.net/InfoCenter/index?page=content&id=KB16188
