this has already been done with a lot of embedded-type devices; look up e.g. the Mesh Potato (low-power telecom / wireless mesh bubble) - I love building remote/standalone stations with those things - http://developer.servalproject.org/dokuwiki/doku.php?id=content:meshextender:solar_operation
(you can see it already gets massive: they are using a 12V 30Ah battery and a 40W solar panel, and the Mesh Potato is not a device that uses much power)
it's really just all arithmetic...
- how many hours per day will the edison need to be on battery
- how much power does it draw per hour
--> gives you what amp-hour battery you need
(and you probably really don't want to be draining the battery full-dead overnight every time - you really don't want it getting below 50%, or maybe not even that low)
- typical retail panels usually end up at about 17-18V in full sun (barely squeaking by if directly connected to J21, and I have no idea whether it would be tolerated if the panel happened to spike to say 19V or 20V - open-circuit they can max out at more like 22-23V, even)
(so you might want a charge controller simply to drop the "load" (aka edison) side down to 12V. inexpensive controllers are rated as low as 5A, 7A, 10A. also if you use a charge controller, you could completely skip the edison J2 minimalist battery specs, and connect a big honking SLA/glass-mat 12V battery to the charge controller itself)
if you are building your own panel from components, obvs you have direct control over the output voltage, plus you could add DC regulation, etc.
- what actual wattage output do you need to:
-- a. power the edison? (obvs different given how it's being used, etc.)
-- b. charge the battery at a fast enough rate to recover from the overnight loss
an inexpensive 20W panel might give a bit over 1A at best component quality; the output voltage is what actually varies. if you are getting a lot of sun, 1A @ 18V will give you 18 watts. in poor sun, maybe 11-12 watts or less. naturally there are losses around every corner, too.
do the math: e.g. if your edison only ever uses, say, 800mA @ 5V (call it 400mA @ 12-13V), does the remaining power from the solar panel provide enough charge to top off your battery? and does it provide it fast enough over the hours the sun is shining?
and of course start to increase all specs because your geography will factor in how many hours of sun you get, how useful each hour of sun actually is, etc.
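putting the steps above into a quick sketch (every number here is a placeholder assumption, not a measured Edison figure - measure your own draw and sun hours):

```python
# solar sizing sketch -- all inputs are assumptions, plug in your own numbers
load_watts = 4.0              # assumed edison draw, e.g. ~800mA @ 5V
hours_on_battery = 21.0       # hours per day without useful sun
battery_voltage = 12.0
max_depth_of_discharge = 0.5  # don't drain below 50%

# battery: how many amp-hours to survive the dark hours
overnight_wh = load_watts * hours_on_battery
battery_ah = overnight_wh / battery_voltage / max_depth_of_discharge
print(f"battery: at least {battery_ah:.1f} Ah @ {battery_voltage:.0f}V")

# panel: must run the load AND replace the overnight drain during sun hours
sun_hours = 3.0               # useful full-sun-equivalent hours per day
recharge_w = overnight_wh / sun_hours
panel_w = load_watts + recharge_w
print(f"panel: roughly {panel_w:.0f} W (before any losses)")
```

with these made-up numbers you land around a 14Ah battery and a ~32W panel, which is why short winter sun hours dominate the sizing.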
If you don't mind me bouncing my understanding off you, and you correcting my misconceptions ...
Can you tell me what the 15% might be referring to, and how it plays into the equations?
> 4 or 6 panels: 3"x6" solar cell panels - 15% efficiency, 0.5A x 3.6V 1.8W each panel.
So when we look at only 4 or 6 of these DIY 3"x6" cells, it sounds like they wouldn't come close?
I was originally thinking DIY would come closer to the 5V @ 1A (assuming an unknown 800mA draw, i.e. 200mA over the Edison module's 600mA max, continuous 24h). I was hoping to keep the size (in area) down with 4 to 6 panels, but if I understand correctly, it's the "way over the top" spike in watts needed during the limited 3 hours of average daylight I have in my region to keep my battery topped off - that ultimately will determine how many panels I'd need?
I have more reading to do ...
For the Intel guys ... is there an upper limit on the charging circuit that I need to be aware of?
I calculated how large a panel is needed to power a 4W load (5V at 800mA, for example) and came to the conclusion that the minimum panel size is 30W at my location.
I live in Fremont, California, where the average monthly insolation in kWh/m2/day is relatively good. There is a very large variation depending on where you live.
The table below shows the average per month for a solar panel facing south and with optimal tilt (values in brackets are for the same panel in horizontal position). The last row shows the average daily insolation for the year.
Jan 4.07 (2.39)
Feb 4.67 (3.26)
Mar 5.7 (4.7)
Apr 6.71 (6.31)
May 7.23 (7.3)
Jun 7.83 (7.82)
Jul 7.22 (7.41)
Aug 6.89 (6.66)
Sep 6.3 (5.45)
Oct 5.85 (4.14)
Nov 4.67 (2.79)
Dec 3.91 (2.14)
average 5.92 (5.03)
In December I should get an average of 3910Wh/m2 × 0.192m2 × 0.15 × 0.85 = 96Wh per day.
0.192 is the panel area in m2
0.15 is the panel efficiency, 15% is a typical number
0.85 is the charge efficiency covering losses in charger and battery.
96Wh/12V=8Ah this is how much charge I get from the panel, on average.
96Wh/24h = 4W. this would be the maximum continuous load for that panel, provided the battery is large enough.
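for reference, the December calculation above as code (same figures as in the text):

```python
# reproduce the december numbers from the post above
insolation_wh_per_m2 = 3910   # 3.91 kWh/m2/day, december, tilted panel
panel_area_m2 = 0.192
panel_efficiency = 0.15       # 15%, a typical number
charge_efficiency = 0.85      # charger + battery losses

daily_wh = insolation_wh_per_m2 * panel_area_m2 * panel_efficiency * charge_efficiency
daily_ah_at_12v = daily_wh / 12
continuous_load_w = daily_wh / 24
print(f"{daily_wh:.0f} Wh/day, {daily_ah_at_12v:.0f} Ah into a 12V battery, "
      f"{continuous_load_w:.0f} W continuous max")
```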
the efficiency describes how good each cell is at doing its job, i.e. converting: in this case, the energy of your electrical output is 15% of the energy of the ambient light hitting the cell.
things like dirty glass would be regarded as affecting efficiency.
things like poor sun would not be. (i.e. you're getting 15% of the energy value of the light, regardless of whether it's good light or poor light)
normally you would pick your desired voltage and amperage, and wire up sets of single panels in series, and then wire those sets in parallel (or you could do the opposite way parallel-then-serial; one of them is supposed to be "better" but I forget which & why)
e.g. put 3 of those 0.5A x 3.6V panels in series and you have 0.5A @ 10.8V. repeat with 3 more. connect the 2 groups in parallel and you now have 1A @ 10.8V
(this is seriously oversimplifying and ignoring the fact that you probably should have wired in diodes at certain points, and of course real-world losses not to mention actual electrical laws, but you get the idea)
so, your 6-cell array, in the best possible conditions, outputs 10.8V 1A: 10.8W. (obvs there are choices in wiring that you can get any combination of V and A you want, but in the end it's going to be 10.8W for those particular 6 cells.)
since you are using solar, and the Edison J21 wants 7V or higher, even a theoretical max of 10.8V is pushing it; if the sky turns cloudy, your output could drop below 7V. so maybe wire 8 cells for 14.4V, etc.
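the series/parallel bookkeeping is easy to script (idealized: identical cells, no diode, mismatch, or wiring losses):

```python
# series/parallel combination sketch for identical solar cells
cell_volts, cell_amps = 3.6, 0.5   # the DIY 3"x6" cells from the thread

def array_output(in_series: int, parallel_strings: int):
    """series adds voltage, parallel adds current (ideal identical cells)."""
    return cell_volts * in_series, cell_amps * parallel_strings

v, a = array_output(in_series=3, parallel_strings=2)   # 6 cells, 3s2p
print(round(v, 1), "V,", round(a, 1), "A,", round(v * a, 1), "W")
```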
anyway, if the Edison is consuming 600ma @ 5V, that's 3W. so you have 7W (at most) to charge your LiPo.
if you are running from battery for 20 hours a day, that 3W is equivalent to something like 800mA at the LiPo's ~3.7-4V ...?
so that's something like a 16Ah 3.7V battery? or a 5Ah 12V? (which in any case should be doubled or more, in practice)
which a ~7W source is not going to charge in 3 hours. it sounds crazy-high, maybe my math is wrong.
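the back-of-envelope above checks out in a few lines (same assumed figures as the text: 3W load, 20h on battery, ~7W surplus for charging):

```python
# sanity-check of the battery math above (assumed figures from the post)
load_w = 3.0            # edison at 600mA @ 5V
battery_hours = 20.0    # hours per day running from battery
lipo_v = 3.7

overnight_wh = load_w * battery_hours        # energy drained each day
battery_ah = overnight_wh / lipo_v           # capacity needed at 3.7V
charge_w = 7.0                               # surplus panel power, at most
recharge_hours = overnight_wh / charge_w
print(f"{battery_ah:.0f} Ah @ {lipo_v}V, needs {recharge_hours:.1f} h "
      f"of full sun to refill")
```

60Wh drained versus ~21Wh harvested in 3 sun hours: the arithmetic in the post is right, the sun hours are just too few.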
I feel like I made a mistake somewhere, ha - my phone only has a 2200mAh battery @ 3.8V, and that thing lasts for 7 days (but 90% of the time it is asleep with all radios turned off, so...)
(and, again, this is even with faking/ignoring losses and electrical laws)
lovely, mind-numbing, head-aching arithmetic.
still, what I would do is get a big 12V marine gel battery, a 5A or 10A charge controller, and a 40W/12V panel. that's probably about USD200 total.
ignore lipo*, ignore the Edison charging capability.
(you can totally get stacks and stacks of inexpensive cells on ebay and solder up your own panel, that part is fun and cheap!)
* although... if you want to source some reputable/quality LiFePO4 cells, building up your own battery is inexpensive and not difficult.
not sure if the Edison charger can properly deal with lifepo, but some of the retail solar charge controller modules can.
A simpler way to do the calculation is to just look up the average kWh per square meter per day for your location in the online database I linked to (3.91 in my example) and multiply it by the nominal panel power (for example 30W); this gives the number of watt-hours the panel will provide. This way you don't need to worry about panel area or panel efficiency. It works because the nominal panel power (30W) is measured under standard test conditions, which is 1000W/m2 with a panel temperature of +25C.
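the shortcut works because insolation in kWh/m2/day is numerically the same as "peak sun hours" of 1000W/m2 sun; a sketch with the numbers from this thread:

```python
# nominal-power shortcut: kWh/m2/day == equivalent hours of 1000 W/m2 sun
peak_sun_hours = 3.91     # december insolation from the table above
nominal_panel_w = 30      # rated at 1000 W/m2, +25C panel temperature
system_efficiency = 0.85  # charger + battery losses, as in the text

daily_wh = peak_sun_hours * nominal_panel_w * system_efficiency
print(f"{daily_wh:.0f} Wh/day")
```

this lands slightly above the 96Wh from the area-based method, because the example panel (0.192m2 at 15%) is really a 28.8W panel, not 30W.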
Note that there are some variations in real life that can greatly affect the accuracy of the calculations. For example, the number of watt-hours is affected by panel temperature, usually a reduction of 0.45% per degree of temperature rise above +25C; so if the panel is located in a hot climate, you will get less solar energy from the panel than expected. On the other hand, if you are in a cold climate you will get more.
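the derating rule can be sketched like this (the 0.45%/degree coefficient is from the text; the panel temperatures are made-up examples):

```python
# temperature derating sketch: ~0.45% output loss per degree above +25C
def derated_power(nominal_w: float, panel_temp_c: float, coeff: float = 0.0045):
    """panel output scaled by the linear temperature coefficient."""
    return nominal_w * (1 - coeff * (panel_temp_c - 25))

print(derated_power(30, 60))   # hot panel in full sun: less than nominal
print(derated_power(30, 0))    # cold winter day: more than nominal
```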
Power losses in cables and in the charge controller, tracking efficiency of the charge controller and the charge efficiency of the battery will reduce the number of watt hours that is available for powering the load.
a good charge controller uses Maximum Power Point Tracking to utilize as much as possible of available panel power. a sophisticated charger such as the LT8490 has a tracking efficiency around 99% and a power stage efficiency of 98%. cable losses will depend on wire size and length. battery charge efficiency is about 90% for a sealed lead acid battery. when you add up these losses you end up with the number 85% that I used in my calculation. I did not take panel temperature into account.
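the 85% figure stacks up roughly like this (the cable loss is my assumed number; the other three are from the text):

```python
# where the ~85% system efficiency in the earlier calculation comes from
tracking_eff = 0.99      # MPPT tracking efficiency (LT8490-class charger)
power_stage_eff = 0.98   # charger power stage efficiency
battery_eff = 0.90       # sealed lead-acid charge efficiency
cable_eff = 0.97         # assumed wiring loss, depends on wire size/length

total = tracking_eff * power_stage_eff * battery_eff * cable_eff
print(f"overall: {total:.0%}")
```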
if you do not use a charger that has maximum power point tracking, you will get much less power from the same panel. the most common type of charge controller just connects the panel to the battery and disconnects when the battery voltage rises too high, then cycles on and off. this wastes a lot of the available power, because the optimum panel voltage is almost never the same as the battery voltage - look at the power-voltage curve of a solar panel to see what I am talking about. the panel voltage variation with temperature also gives this cheap type of charger a really bad tracking efficiency.

you may not worry about that if the solar panel is only used occasionally to charge a boat battery, but for a system that has to run continuously year round you need the more sophisticated charger type (MPPT).

another advantage of a charger built with, for example, the LT8490 controller is that the panel voltage does not have to be the same as the battery voltage: you can use a 36 cell ("12V") panel or a 48, 60 or 72 cell panel to charge the same 12V battery. for northern locations you may want to consider a 200W panel or larger, and then you get more watts per dollar with a 60 cell solar panel of the type used in large installations.
I think it makes sense to use 12V battery voltage for a system that powers the Edison by solar panel. The on-board Li-Ion charger is not made for handling the maximum power from the panel; if the panel is delivering 30W or more, you need a much bigger battery and a bigger charger.
Note that a battery must be used with a solar panel, to store energy of course, but also to avoid the large voltage variations that a solar panel gives. if you don't draw any current from the panel the voltage will rise to the open circuit voltage which can be high. a 36 cell panel can deliver up to 27V if it is cold.
it is also important to know that you cannot just attach a switching charge regulator to a solar panel and think that it will work. the solar panel is a current source, not a voltage source, and if you just connect a switching regulator ( of the type that is plugged into a cigarette lighter socket) to drop the voltage to 5V, the panel voltage can start collapsing and the system starts and stops continuously. the switcher must be specifically designed to operate with a solar panel.